gRPC API packages#
Protobuf service definitions provide the API specification for underlying
server implementations so that each consuming client library has a clear
contract for gRPC data messages. Ideally, the Protobuf (.proto) files
have a single repository established as the source of truth, organized by
API version increment as the API definition expands and changes. Because
most client libraries are custom implementations enhancing the developer
experience when consuming the service, releasing the Protobuf definitions
publicly gives full flexibility to developers to operate at the abstraction
layer they choose.
Maintain API definition repository#
Because the Protobuf definition of the service is language agnostic, the repository containing the PROTO files can be created within the top-level Ansys GitHub organization.
Every update of the PROTO files follows a standard pull request process as a sanity check for API definition accuracy. Language-specific packages can be generated for each merge or on a set cadence.
Manage Protobuf definitions for Python clients#
Within Ansys, and more specifically in the PyAnsys environment, most client libraries have a dedicated Python package containing the needed PROTO files compiled as Python source code. These are typically consumed by the PyAnsys client libraries for communicating with their respective services.
For example, PyMAPDL consumes the ansys-api-mapdl package, which is built in the ansys-api-mapdl repository.
Build an ansys-api-<service> repository#
The Ansys GitHub organization has a dedicated template repository for creating PROTO file repositories and the needed files to generate the Python API packages to be consumed by the PyAnsys clients.
To set up an API repository like the ansys-api-mapdl one, select the ansys-api-template repository when creating a repository within the Ansys GitHub organization.
To understand how to use the ansys-api-template repository, see Expected usage in this repository's README.
Build Python stub classes#
The ansys-api-template repository uses the ansys-tools-protoc-helper utility to auto-generate Python wheels that can be consumed by downstream Python client libraries.
To use the ansys-tools-protoc-helper utility, include it in the pyproject.toml file as a build dependency:
[build-system]
requires = ["setuptools >= 42.0.0", "wheel", "ansys_tools_protoc_helper"]
Then generate a Python wheel containing the autogenerated Python source with these commands:
pip install -U pip
pip install build
python -m build
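After installing the resulting wheel (for example, with pip install dist/*.whl), you can run a quick sanity check to confirm that the generated stub modules import correctly. This minimal snippet assumes the ansys-api-mapdl package mentioned earlier; substitute your own package name:
# Quick sanity check after installing the built wheel.
# The module path assumes the ansys-api-mapdl example; adjust it for your package.
import importlib

module = importlib.import_module("ansys.api.mapdl.v0")
print(f"Successfully imported {module.__name__}")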
Publish the API package#
PyPI is the public package index where API packages are typically released.
Here is an example of a workflow pipeline for building and publishing the Python stub package. In this example, the ansys-api-geometry workflow is shown. However, you can easily copy and adapt this workflow. Only the PYTHON_PACKAGE_IMPORT environment variable would have to be changed:
name: GitHub CI

on:
  pull_request:
  push:
    tags:
      - "*"
    branches:
      - main

env:
  MAIN_PYTHON_VERSION: '3.10'
  PYTHON_PACKAGE_IMPORT: 'ansys.api.geometry.v0'

jobs:
  build:
    name: Build package
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
      - name: Install build requirements
        run: |
          pip install -U pip
          pip install build
      - name: Build
        run: python -m build
      - name: Install
        run: pip install dist/*.whl
      - name: Test import
        run: |
          mkdir tmp
          cd tmp
          python -c "import ${{ env.PYTHON_PACKAGE_IMPORT }}; print('Successfully imported ${{ env.PYTHON_PACKAGE_IMPORT }}')"
          python -c "from ${{ env.PYTHON_PACKAGE_IMPORT }} import __version__; print(__version__)"
      - name: Upload packages
        uses: actions/upload-artifact@v3
        with:
          name: ansys-api-package
          path: dist/
          retention-days: 7

  release:
    name: Release package
    if: github.event_name == 'push' && contains(github.ref, 'refs/tags')
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
      - uses: actions/download-artifact@v3
      - name: Display structure of downloaded files
        run: ls -R
      - name: Upload to Public PyPi
        run: |
          pip install twine
          twine upload --skip-existing ./**/*.whl
          twine upload --skip-existing ./**/*.tar.gz
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
      - name: Release
        uses: softprops/action-gh-release@v1
        with:
          generate_release_notes: true
          files: |
            ./**/*.whl
            ./**/*.tar.gz
            ./**/*.pdf
Version the API package#
PyPI packages follow semantic versioning, while gRPC Protobuf API versions typically follow a simplified v* versioning pattern. The PyPI package version is not expected to synchronize with the Protobuf API version, and multiple public APIs can be exposed simultaneously. For example, if you have a v0 for MAPDL exposed, you can access it with this code:
from ansys.api.mapdl.v0 import mapdl_pb2_grpc
If the API also has a v1 exposed, a different library could use:
from ansys.api.mapdl.v1 import mapdl_pb2_grpc
Ansys follows Microsoft's gRPC versioning recommendations, which stipulate that incrementing the gRPC Protobuf version is only necessary when making a backward-incompatible (breaking) change. Non-breaking changes include:
Adding a service
Adding a method to a service
Adding a field to a request message
However, this only applies to the vN gRPC Protobuf API. Python packages tend to follow semantic versioning, and PyAnsys packages follow this approach. Therefore, these Python gRPC API packages should also follow semantic versioning.
Plan on releasing a new minor version when adding or removing features, messages, and services.
Plan on releasing a patch release when fixing bugs that do not change the behavior of the API.
Only plan on releasing a major release once the API is stable and no major release is scheduled in the near future.
This way, you can expose a v0 and/or v1 gRPC Protobuf API and release frequent updates using semantic versioning.
Release the API package#
As shown in the release section of the previous GitHub workflow, once the Python API package is built, it is uploaded to the public PyPI. To do so, it is necessary to have access to the PYPI_TOKEN secret within the GitHub repository. To get the needed credentials, contact the PyAnsys core team.
If the package cannot be uploaded to the public PyPI yet but your Python client library needs to consume this Python API package, it can be uploaded to the private PyAnsys PyPI instead. For the required PYANSYS_PYPI_PRIVATE_PAT password, contact the PyAnsys core team.
In this case, the Upload to Public PyPi workflow step should be replaced with this Upload to Private PyPi step:
- name: Upload to Private PyPi
  run: |
    pip install twine
    twine upload --skip-existing ./**/*.whl
    twine upload --skip-existing ./**/*.tar.gz
  env:
    TWINE_USERNAME: PAT
    TWINE_PASSWORD: ${{ secrets.PYANSYS_PYPI_PRIVATE_PAT }}
    TWINE_REPOSITORY_URL: https://pkgs.dev.azure.com/pyansys/_packaging/pyansys/pypi/upload
Consume the API package within Python#
Once the API package has been published to PyPI, you can include a reference within the client library build dependencies. For information on how to specify a project’s required dependencies, see Required dependencies.
Use the API package within the Python client#
The stub imports follow a standard pattern. For each API service, there is a *_pb2 module that defines all messages within a specific service file and a *_pb2_grpc module that defines a Stub class that encapsulates all service methods.
Example gRPC imports within the wrapping client library#
from ansys.api.geometry.v0.designs_pb2 import (
    ExportDesignRequest,
    NewDesignRequest,
    SaveAsDocumentRequest,
)
from ansys.api.geometry.v0.designs_pb2_grpc import DesignsStub
The best practice is to create a Pythonic client library that organizes the service methods in a user-friendly manner. At a minimum, this library should act as a facade layer wrapping the service calls so that the Pythonic API can have a consistent abstraction, independent of underlying implementations.
For each client library release, only a single gRPC API version should be wrapped to maintain a consistent API abstraction expectation for the supporting server instances.
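For illustration only, here is a minimal sketch of such a facade. The imports match the earlier example, but the channel target, the New RPC name, and the name field of NewDesignRequest are assumptions made for this sketch rather than the actual ansys-api-geometry definitions:
import grpc

from ansys.api.geometry.v0.designs_pb2 import NewDesignRequest
from ansys.api.geometry.v0.designs_pb2_grpc import DesignsStub


class DesignsClient:
    """Facade that hides gRPC message handling behind a Pythonic API."""

    def __init__(self, target: str = "localhost:50051"):
        # The client library owns the channel so that users never deal with
        # gRPC plumbing directly. The target address is illustrative only.
        self._channel = grpc.insecure_channel(target)
        self._stub = DesignsStub(self._channel)

    def create_design(self, name: str):
        """Create a design on the server and return the raw response.

        The New RPC and the name field are hypothetical placeholders; use
        the methods and fields defined in your own PROTO files.
        """
        request = NewDesignRequest(name=name)
        return self._stub.New(request)
Keeping the channel and stub private to the facade lets the Pythonic API evolve independently of the auto-generated code.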
Public versus private Python API package#
Whether to host the PROTO files in a public or a private repository is up to the owner of each repository.
In terms of intellectual property (IP) concerns, the PROTO files are typically not an issue because they do not expose any critical service logic or knowledge. In most cases, the APIs being exposed through the PROTO files are already exposed publicly through other mechanisms.
Thus, the general recommendation is to make these repositories public as soon as possible. The main reasons for doing so follow:
Private Python package dependencies usually involve workarounds when setting up the workflow. It is best to keep the workflows as standard and simple as possible. That implies making all its dependencies public, including this API Python package.
The generated Python API package must eventually be uploaded to the public PyPI so that it can be consumed by its corresponding Python client library (when it is publicly released). So, if there are no issues with making it public, it is better to do so sooner rather than later.
Once the Python API package is publicly released to PyPI, there is no reason to keep the repository private because all users who consume the Python API package have direct access to the PROTO files that are in the repository.
However, before making any repository public in the Ansys GitHub organization, review the Ansys Open Source Developer’s Guide to verify that the repository is compliant with all the needed requirements.