System requirements
Robocorp tests on the systems listed below. Other operating systems may also work, but we cannot guarantee support for them.
Windows
We recommend keeping Windows systems up to date according to the Microsoft release information linked below, as older versions carry potential security risks.
For development tasks, users usually need elevated permissions to install the software, but Robocorp applications avoid requiring admin permissions at runtime.
Windows 10 and 11
- Professional and Enterprise editions.
- Windows 11 current versions
- Windows 10 current versions
- Version 20H2 (build 19042/1909) or later is required due to long path support.
Windows Server
- Windows Server 2019 version or later with long path support enabled.
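Long path support is an OS-level setting and is not enabled by default on Windows Server. As a sketch (assuming registry edits are permitted in your environment), it can be enabled from an elevated prompt by setting the documented LongPathsEnabled registry value; a reboot may be required for it to take effect:

```shell
# Enable NTFS long path support (run as Administrator; reboot may be required)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v LongPathsEnabled /t REG_DWORD /d 1 /f
```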
macOS
Apple does not publish a clear statement on which macOS versions are officially supported, but their de facto rule seems to be that the latest two releases are supported, so we aim to adhere to that as well.
👉 Detailed list of macOS versions and revisions
On ARM machines (Apple M1/M2/M3), native support in many tools and libraries is still very limited, so we support them via the Rosetta 2 translation layer.
Linux
Supported Linux distributions are specified per component. Other Linux distributions may work just fine, but our standard support agreement covers the following.
Development using Visual Studio Code on Linux
- Debian 12 "bookworm"
- Ubuntu 22.04
Linux container for running unattended automations via Robocorp Worker
- Debian 12 "bookworm"
- Note: Ubuntu 22.04 is NOT supported for unattended Chrome browser automations, because Chrome is distributed as a Snap package
Assistant and Setup Utility on Linux
- Technical support is provided only under an extended support contract. Get in touch with your account manager if interested.
Web browsers
- The latest version of Chrome is the recommended browser.
- Tools and libraries also support other browsers, but we limit our tests to Chrome.
Minimum system requirements
Minimum system requirements are always subjective because even a low-end system can edit and execute robots.
For Developers
Developers creating the automation should always have a capable machine because there is nothing worse than wasting development time due to hardware limitations.
👉 At least 6 cores, 16GB of memory, 40GB of free disk space on a fast disk (SSD)
- The Robocorp toolchain is designed to utilize multicore CPUs, so more logical cores are always better.
- For example, virus scanners commonly utilize one CPU core.
- Testing multiple browser automations and running multiple applications at the same time is common during development, so memory capacity needs to match that workload
- Automations and Python are typically very file I/O intensive
- Developers need to build and test Python environments, so a fast disk is crucial
- On Windows machines, the required 40GB of free disk space on a fast disk (SSD) should be on the C: drive
- Python tooling and the security restrictions around users' home folders require the tools to work on the C: drive
For Workers and running the automation
When executing automations, the system requirements can vary a lot. The most significant factors for the system requirements are:
- The operating system needed:
- Windows needs considerably more resources than Linux
- The target automation:
- Big websites require a lot of memory
- Heavy Windows applications and virus scanners can eat up a lot of CPU cores and processing power
- Automations that work with thousands of PDF files require a lot of disk space and fast file I/O
Minimum setup:
👉 4 CPU cores, 8GB of memory, 40GB of free disk space on a fast disk (SSD)
- File I/O speed affects execution times a lot.
- Features like video streaming are not recommended on systems with few cores
- On Windows machines, the required 40GB of free disk space on a fast disk (SSD) should be on the C: drive
- Python tooling and the security restrictions around users' home folders require the tools to work on the C: drive
Examples and references
We recommend getting the RCC tool and running rcc config speedtest to evaluate your system.
The speed test performs a standard set of actions that exercise CPU, network, and disk speed to produce relative scores.
For network and filesystem, scores above zero are good; negative values indicate slower performance.
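If you want to check the scores programmatically, for example in a provisioning script, the result line can be parsed with standard tools. Here is a minimal sketch that assumes the output format shown in the reference results below; adjust the pattern if your RCC version prints something different:

```shell
# Minimal sketch: extract the network and filesystem scores from an
# "rcc config speedtest" result line (sample line taken from the
# reference results; the exact format is an assumption).
result='Score for network=89, filesystem=96, and time=39 with 19 workers on "windows_amd64".'
network=$(printf '%s' "$result" | sed -n 's/.*network=\(-\{0,1\}[0-9][0-9]*\).*/\1/p')
filesystem=$(printf '%s' "$result" | sed -n 's/.*filesystem=\(-\{0,1\}[0-9][0-9]*\).*/\1/p')
echo "network=$network filesystem=$filesystem"
# Simple sanity gate: scores below zero mean the system is slower than the baseline
if [ "$network" -lt 0 ] || [ "$filesystem" -lt 0 ]; then
    echo "Warning: system is slower than the reference baseline"
fi
```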
For reference:
- A good developer machine:
- Dell XPS 15 9520, 16GB RAM, 14 cores (20 logical processors), 500GB NVMe SSD
- Antivirus and disk encryption on
- Result:
Score for network=89, filesystem=96, and time=39 with 19 workers on "windows_amd64".
- Robocorp hosted Cloud Worker running on a Debian Linux Docker container
- The disk is normally the slowest part of a virtualized system, but the Cloud worker still manages a good overall time of just over a minute
- Result:
Score for network=85, filesystem=-131, and time=65 with 7 workers on "linux_amd64".
- Low-end Windows VM from AWS LightSail
- 8 GB RAM, 2 vCPUs, 160 GB SSD
- Price 2024-03-28: $70 / month
- Capable of running Windows desktop automations, but not fast.
- Result:
Score for network=-16, filesystem=-677, and time=220 with 2 workers on "windows_amd64"