gRPC API packages#
Protobuf service definitions provide the API specification for underlying
server implementations so that each consuming client library has a clear
contract for gRPC data messages. Ideally, the .proto
files have a single
repository established as the source of truth, organized by API version
increment as the API definition expands and changes. Because most client
libraries are custom implementations enhancing the developer experience
when consuming the service, releasing the Protobuf definitions
publicly gives full flexibility to developers to operate at the abstraction
layer they choose.
Maintain API definition repository#
Because the Protobuf definition of the service is language agnostic, the repository containing the Protobuf files can be created within the top-level Ansys GitHub organization. Every update of the Protobuf files follows a standard pull request process as a sanity check for API definition accuracy. Language-specific packages can be generated for each merge or on a set cadence.
Managing Protobuf definitions for Python clients#
Within Ansys, and more specifically in the PyAnsys environment, most client libraries
have a dedicated Python package containing the needed .proto
files compiled as
Python source code. The PyAnsys client libraries consume these packages to
communicate with their respective services.
For example, PyMAPDL consumes the
ansys-api-mapdl
package, which is built in the
ansys-api-mapdl repository.
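For illustration, the following sketch shows how a client library might open a gRPC channel and instantiate a stub from such a compiled package. This is a sketch only: the server address, port, and MapdlServiceStub class name are assumptions based on the ansys-api-mapdl example, and the actual stub name depends on the service definition.
# Hypothetical sketch: connecting to a running MAPDL gRPC server and creating a
# stub from the compiled ansys-api-mapdl package. The address, port, and stub
# class name (MapdlServiceStub) are assumptions for illustration only.
import grpc

from ansys.api.mapdl.v0 import mapdl_pb2_grpc

channel = grpc.insecure_channel("localhost:50052")  # assumed server address
stub = mapdl_pb2_grpc.MapdlServiceStub(channel)     # assumed stub class name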
How to build an ansys-api-<service>
repository#
The Ansys GitHub organization has a dedicated template repository for creating
these .proto
file repositories and the needed files to generate the Python API
packages to be consumed by the PyAnsys clients.
To set up an API repository like ansys-api-mapdl, select the ansys-api-template repository when creating a new repository within the Ansys GitHub organization.
Follow the instructions in the ansys-api-template - Expected usage section to understand how to use the template repository.
Building Python stub classes#
The template repository uses the ansys-tools-protoc-helper library to auto-generate Python wheels that can be consumed by downstream Python client libraries.
To use it, include this tool in the pyproject.toml
file as a build dependency:
[build-system]
requires = ["setuptools >= 42.0.0", "wheel", "ansys_tools_protoc_helper"]
Then generate a Python wheel containing the autogenerated Python source with:
pip install -U pip
pip install build
python -m build
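After installing the built wheel, a quick import check confirms that the generated modules are usable. The package name in this sketch follows the ansys-api-mapdl example and is an assumption; substitute the name of your own API package.
# Quick sanity check after "pip install dist/*.whl". The package name is an
# assumption (ansys-api-mapdl example); __version__ is generated by the template.
import ansys.api.mapdl.v0 as api

print(f"Successfully imported {api.__name__} (version {api.__version__})")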
Publishing Python API package#
PyPI is the common package index where API packages are released.
Here is an example of a workflow pipeline for building and publishing the Python stub package. The ansys-api-geometry workflow is shown, but it can be easily copied and adapted; only the PYTHON_PACKAGE_IMPORT environment variable has to be changed:
name: GitHub CI
on:
  pull_request:
  push:
    tags:
      - "*"
    branches:
      - main
env:
  MAIN_PYTHON_VERSION: '3.10'
  PYTHON_PACKAGE_IMPORT: 'ansys.api.geometry.v0'
jobs:
  build:
    name: Build package
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
      - name: Install build requirements
        run: |
          pip install -U pip
          pip install build
      - name: Build
        run: python -m build
      - name: Install
        run: pip install dist/*.whl
      - name: Test import
        run: |
          mkdir tmp
          cd tmp
          python -c "import ${{ env.PYTHON_PACKAGE_IMPORT }}; print('Successfully imported ${{ env.PYTHON_PACKAGE_IMPORT }}')"
          python -c "from ${{ env.PYTHON_PACKAGE_IMPORT }} import __version__; print(__version__)"
      - name: Upload packages
        uses: actions/upload-artifact@v3
        with:
          name: ansys-api-package
          path: dist/
          retention-days: 7
  release:
    name: Release package
    if: github.event_name == 'push' && contains(github.ref, 'refs/tags')
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ env.MAIN_PYTHON_VERSION }}
      - uses: actions/download-artifact@v3
      - name: Display structure of downloaded files
        run: ls -R
      - name: Upload to Public PyPI
        run: |
          pip install twine
          twine upload --skip-existing ./**/*.whl
          twine upload --skip-existing ./**/*.tar.gz
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
      - name: Release
        uses: softprops/action-gh-release@v1
        with:
          generate_release_notes: true
          files: |
            ./**/*.whl
            ./**/*.tar.gz
            ./**/*.pdf
Versioning#
PyPI packages follow semantic versioning, while gRPC Protobuf API versions
typically follow a simplified v* versioning pattern. The PyPI package version
is not expected to be synchronized with the Protobuf API version, and multiple
public API versions can be exposed simultaneously. For example, if a v0 API is
exposed for MAPDL, you can access it with:
from ansys.api.mapdl.v0 import mapdl_pb2_grpc
If a v1 API is also exposed, a different library could use:
from ansys.api.mapdl.v1 import mapdl_pb2_grpc
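One way to keep this manageable in a client library is to pin the wrapped API version in a single place and alias the imported modules, so moving from v0 to v1 is a one-line change. The following sketch is illustrative only; the module names follow the MAPDL example, and the alias names are assumptions.
# Hypothetical sketch: pin the wrapped Protobuf API version in one place.
# Switching the client library from v0 to v1 then only requires updating these
# imports. The aliases (mapdl_messages, mapdl_services) are illustrative names.
from ansys.api.mapdl.v0 import mapdl_pb2 as mapdl_messages
from ansys.api.mapdl.v0 import mapdl_pb2_grpc as mapdl_services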
Ansys follows Microsoft’s gRPC versioning recommendations, which stipulate that the gRPC Protobuf version only needs to be incremented when making a breaking (backward-incompatible) change. Non-breaking changes include:
Adding a new service
Adding a new method to a service
Adding a field to a request message
However, this only applies to the vN
gRPC Protobuf API. Python packages
tend to follow semantic versioning, and PyAnsys packages follow that
approach. Therefore, these Python gRPC API packages should also follow semantic
versioning. Plan on releasing a new minor version when:
Adding or removing features, messages, services, and so on
Release a patch version when:
Fixing bugs that do not change the behavior of the API
Only plan on releasing a major version once the API is stable and no major changes are expected in the near future.
This way, you can expose a v0
and/or v1
gRPC Protobuf API and release
frequent updates using semantic versioning.
Releasing#
As shown in the release job of the preceding GitHub workflow, once the Python
API package is built, it is uploaded to the public PyPI. To do so, the
PYPI_TOKEN secret must be available within the GitHub repository. Contact the
PyAnsys Core team at pyansys.core@ansys.com to get the needed credentials.
If the package cannot be uploaded to the public PyPI yet, but your Python
client library needs to consume this Python API package, it can also be
uploaded to the private PyAnsys PyPI. Email the PyAnsys Core team at
pyansys.core@ansys.com for the required PYANSYS_PYPI_PRIVATE_PAT password.
In this case, the Upload to Public PyPI step of the workflow should be
replaced by:
      - name: Upload to Private PyPI
        run: |
          pip install twine
          twine upload --skip-existing ./**/*.whl
          twine upload --skip-existing ./**/*.tar.gz
        env:
          TWINE_USERNAME: PAT
          TWINE_PASSWORD: ${{ secrets.PYANSYS_PYPI_PRIVATE_PAT }}
          TWINE_REPOSITORY_URL: https://pkgs.dev.azure.com/pyansys/_packaging/pyansys/pypi/upload
Consuming the API package within Python#
Once the API package has been published to PyPI, a reference to it can be included in the client library's dependencies. To learn how to specify project dependencies, see Required dependencies.
Using the API package within the Python client#
The stub imports follow a standard pattern. For each API service, there is a *_pb2
module that defines all messages within a specific service file and
a *_pb2_grpc
module that defines a Stub
class that encapsulates all service methods.
Example gRPC imports within the wrapping client library#
from ansys.api.geometry.v0.designs_pb2 import (
    ExportDesignRequest,
    NewDesignRequest,
    SaveAsDocumentRequest,
)
from ansys.api.geometry.v0.designs_pb2_grpc import DesignsStub
The best practice is to create a Pythonic client library that organizes the service methods in a user-friendly manner. At a minimum, this library should act as a facade layer wrapping the service calls so that the Pythonic API can have a consistent abstraction, independent of underlying implementations.
For each client library release, only a single gRPC API version should be wrapped to maintain a consistent API abstraction expectation for the supporting server instances.
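The following sketch illustrates both points: a small Pythonic facade over the DesignsStub from the example above, pinned to the v0 API. The request field (name) and the stub method (New) are assumptions for illustration; a real client library would map them to its actual service definition and manage the channel lifecycle.
# Hypothetical facade sketch wrapping the v0 designs service. The stub method
# (New) and request field (name) are assumptions for illustration only.
import grpc

from ansys.api.geometry.v0.designs_pb2 import NewDesignRequest
from ansys.api.geometry.v0.designs_pb2_grpc import DesignsStub


class Designs:
    """Pythonic facade over the v0 designs service."""

    def __init__(self, channel: grpc.Channel):
        self._stub = DesignsStub(channel)

    def create(self, name: str):
        """Create a design, hiding the raw gRPC request from the user."""
        return self._stub.New(NewDesignRequest(name=name))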
Public vs private Python API package#
Making these .proto file repositories public or private is up to the owner of
each repository. In terms of intellectual property (IP) concerns, the .proto
files are typically not an issue because they do not expose any critical
service logic or knowledge. In most cases, the APIs exposed through the .proto
files are already exposed publicly through other mechanisms.
Thus, the general recommendation is to make these repositories public as soon as possible. The main reasons are:
Private Python package dependencies usually involve workarounds when setting up workflows. It is best to keep workflows as standard and simple as possible, which implies making all dependencies public, including this Python API package.
The generated Python API package eventually has to be uploaded to the public PyPI so that it can be consumed by its corresponding Python client library when that library is publicly released. If there is nothing preventing it, it is better to make the package public sooner rather than later.
Once the Python API package is publicly released to PyPI, there is no reason to keep the repository private because all users who consume the Python API package have direct access to the .proto files that are in the repository.
However, before making any repository public within the Ansys GitHub organization, review the Ansys open-source guide documentation to verify that the repository complies with all requirements.