Sharing robot code and libraries

One of the choices you will need to make when developing your automation with Robocorp is how to structure your code into robots and tasks. As robots are just normal code, you can leverage many methods and patterns that have been vetted over decades of software development. In this article, we describe our recommendations for a few common cases:

Share robot code inside a robot that contains multiple tasks

Creating a robot that handles multiple tasks around the same topic is the easiest way to share and contain robot functionality. Here you are creating a single robot that has multiple tasks defined in robot.yaml, and the implementations of these tasks live inside the same robot folder, so they can share common code directly. These kinds of robots are also easy to reuse in multiple processes.

For example, if you are developing a robot that interacts with a web application, one of the first steps will be adding functionality to log in and out of the system. If you need to create another automated process that targets that same application, it is much more efficient to reuse that functionality instead of duplicating it.

In these cases, the best solution is to create a single robot containing as many tasks as you need. This way, common functionality and library dependencies can easily be shared and reused when running an individual task locally or in Control Room. The tasks are defined in the robot's robot.yaml file.

Example

Here is a simple example of how you can structure your robot in this case:

tasks.robot:

*** Keywords ***
Keyword used by multiple tasks
    [Arguments]    ${message}
    Log    ${message}

*** Tasks ***
Task A
    Keyword used by multiple tasks    Message from Task A

Task B
    Keyword used by multiple tasks    Message from Task B

robot.yaml:

tasks:
  Task A:
    robotTaskName: Task A
  Task B:
    robotTaskName: Task B
condaConfigFile: conda.yaml
artifactsDir: output
PATH:
  - .
PYTHONPATH:
  - .
ignoreFiles:
  - .gitignore

Note: We are showing the simplest possible example here. You can also keep your tasks in separate files, move the shared keywords into a resource file, implement the shared code as a custom Python library, and so on. In other words, this method lets you combine the other code-sharing methods described in this article.
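
As a sketch of the resource-file approach, the shared keyword could live in its own file that the task file then imports; the file name keywords.resource below is just an illustration and not part of the example above:

keywords.resource:

*** Keywords ***
# Shared keyword that any task file in this robot can import
Keyword used by multiple tasks
    [Arguments]    ${message}
    Log    ${message}

tasks.robot:

*** Settings ***
# Take the shared keywords into use from the resource file
Resource    keywords.resource

*** Tasks ***
Task A
    Keyword used by multiple tasks    Message from Task A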

Learn more about the robot structure and configuration, and about the robot.yaml configuration file format.

Share robot code via a common repository and Git

If you have two robots that automate different target systems (for example, a CRM and SAP), and you want them to use shared resources, you have several different possibilities.

In small projects, you might end up just using copy and paste to move the common code folders around. While this works, it is not a best practice and should be avoided if possible. Maintaining code that is copied to multiple locations gets hard fast... so there is a Star Wars quote about the dark side hiding here 😉

Our recommendation is to use Git for this. Setting up a common code repository maintained in a single location is easy to get started with and provides a solid foundation for maintenance and usage.

We have public example repositories with the structure and scripts ready to go.

Process of sharing code via a common repository

  1. Common code is created and maintained in a separate Git repository
  2. When the common code is updated, the developers run a release script
    • Give the release a version number (semver)
    • Write a simple release note telling the other robot developers what's new and what has changed
    • Notify your robot developers that a new version is available
  3. To get the common code into the "client repositories", the robot developer only needs to run the update script
    • The script asks for the version number and then does the needed "git magic".
    • The script uses git subtree to copy the given version of the common code into the robot's folder, but robot developers do not need to know this (see the sketch after this list).
    • The critical thing is that the robot developer stays in control of when to update, as they are the one who knows when updates can be done and how to test their robot. Automatically pushing updates is not recommended, but it can of course be done.

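The update script in our example repositories hides the details, but under the hood the commands look roughly like this; the prefix folder, the repository URL, and the version tags below are placeholders for illustration:

# First time: bring a released version of the shared code into a "shared" folder inside the robot repository
git subtree add --prefix=shared https://github.com/your-org/common-robot-code.git v1.2.0 --squash

# Later: update the "shared" folder to a newer released version
git subtree pull --prefix=shared https://github.com/your-org/common-robot-code.git v1.3.0 --squash
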
🎉 The shared code is now updated in the robot's folder and can be used 🎉 If the common repository follows our examples, the robot developer also gets nice release notes documenting what has changed and what's new, right in their robot repository.

Why not use git submodules?
The reason we use git subtree is that it is simpler. All the files are copied, so packaging robots and simply seeing the files in the "client repository" is more straightforward. Most Git integrations also handle submodules poorly, so we avoid them altogether.

Share robot code via private PyPI libraries

Creating libraries out of your common code is the most formal way of doing things. This way, you can use conda.yaml in the client robots to control your dependencies. This usually requires a private service, such as a private PyPI or Conda repository.

Sharing parts of the code as a PyPI or Conda package

Suppose the shared code between your robots is Python code. In that case, you can publish the code as a package to the Python Package Index (PyPI); you can then refer to it in your robot's conda.yaml file as a dependency, just as you would with any other library. Note that PyPI is a public repository, so anyone will have access to your shared code.
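
For example, the client robot's conda.yaml could then pin the published package as a pip dependency; the package name your-shared-library and the version numbers below are placeholders, not real packages:

conda.yaml:

channels:
  - conda-forge
dependencies:
  - python=3.9.13
  - pip=22.1.2
  - pip:
    # Placeholder: your shared code published to PyPI, pinned to a specific version
    - your-shared-library==1.2.0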

You can also set up your own private PyPI repository and Conda channels. Most artifact repository ("artifactory") tools provide these out-of-the-box.

To use a private PyPI package in Control Room, you will need to set up the necessary credentials in Vault and include a preRunScript in robot.yaml that securely downloads the package from your private repository. The example repository shows the recommended pattern.

In addition to PyPI, Robocorp supports and encourages using Conda. You can create your package and add it to the public Conda Forge channel or run your own custom channel.

Again, most artifact repository ("artifactory") tools provide this out-of-the-box.
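
As a sketch, a custom channel is added to the channels list of the client robot's conda.yaml, and the shared package is then pinned like any other Conda dependency; the channel URL and package name below are placeholders:

conda.yaml:

channels:
  # Placeholder: your own private Conda channel served by your artifact repository
  - https://conda.example.com/your-org-channel
  - conda-forge
dependencies:
  - python=3.9.13
  # Placeholder: your shared package served from the custom channel
  - your-shared-library=1.2.0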

When using PyPI or Conda, you are responsible for specifying the package version of your shared module and keeping it up-to-date in each of the robots that use it.

Last edit: September 29, 2022