

# Create a conda channel using S3
<a name="configure-jobs-s3-channel"></a>

If your jobs need to run applications that are not available on the [`deadline-cloud`](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/create-queue-environment.html#conda-queue-environment) or [`conda-forge`](https://conda-forge.org/) channels, you can host a custom conda channel to serve your own packages. When you create a queue in the AWS Deadline Cloud (Deadline Cloud) console, the console adds a conda queue environment by default. To make your packages available to jobs, add the custom channel to the queue environment.

A conda channel is static hosted content that you can host in [a variety of ways](https://rattler-build.prefix.dev/latest/publish/), including on a filesystem or in an Amazon Simple Storage Service (Amazon S3) bucket. If your Deadline Cloud farm uses a shared filesystem for assets, you can use any path on it as a channel name. You can host the channel in an Amazon S3 bucket for broader access using AWS Identity and Access Management (IAM) permissions.
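
For context, a channel is a directory tree with platform-specific subdirectories, each holding a `repodata.json` index and the package files. A small channel might look like the following sketch (the package filename is illustrative):

```
my-conda-channel/
├── noarch/
│   └── repodata.json
└── linux-64/
    ├── repodata.json
    └── blender-4.5.0-hb0f4dca_1.conda
```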

You can [build and test packages locally](build-test-packages-locally.md), then [publish them to a channel](publish-packages-s3-channel.md). Building packages locally is an easy way to start iterating on package build recipes with no infrastructure setup. You can also use a Deadline Cloud [package building queue](automate-package-builds.md) to build packages and publish them to a channel. A package building queue simplifies maintaining packages for multiple operating systems and accelerator configurations. You can update versions and submit full sets of package builds from anywhere.

You can configure channels for your studio and your Deadline Cloud farm in multiple ways. You can have one Amazon S3 channel and configure all your workstations and farm hosts to use it. You can also have more than one channel and set up mirroring with AWS DataSync (DataSync). For example, your Deadline Cloud package building queue can publish to an Amazon S3 channel that gets mirrored on premises for workstations and on-premises farm hosts.

**Topics**
+ [Build and test packages locally](build-test-packages-locally.md)
+ [Publish packages to an Amazon S3 conda channel](publish-packages-s3-channel.md)
+ [Configure production queue permissions for custom conda packages](#s3-channel-configure-permissions)
+ [Add a conda channel to a queue environment](#s3-channel-add-channel)
+ [Create a conda package for an application or plugin](conda-package.md)
+ [Create a conda build recipe for Blender](create-conda-recipe-blender.md)
+ [Create a conda build recipe for Autodesk Maya](create-conda-recipe-maya.md)
+ [Create a conda build recipe for the Maya adaptor](create-conda-recipe-maya-openjd.md)
+ [Create a conda build recipe for Autodesk Maya to Arnold (MtoA) plugin](create-conda-recipe-mtoa-plugin.md)
+ [Automate package builds with Deadline Cloud](automate-package-builds.md)

# Build and test packages locally
<a name="build-test-packages-locally"></a>

Before publishing packages to Amazon S3 or setting up CI/CD automation on your Deadline Cloud farm, you can build and test conda packages on your workstation using a local filesystem channel. This approach lets you rapidly iterate locally on recipes and verify packages.

The `rattler-build publish` command builds a recipe, copies the resulting package to a channel, and indexes the channel in one step. When you target a local filesystem directory, `rattler-build` creates and initializes the channel automatically if the directory does not exist.

The following instructions use the Blender 4.5 sample recipe from the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples) repository on GitHub. You can substitute a different recipe from the samples repository or use your own recipe.

## Prerequisites
<a name="build-test-locally-prereqs"></a>

Before you begin, install the following tools on your workstation:
+ **pixi** – A package manager that you use to install `rattler-build` and to test packages. Install pixi from [pixi.sh](https://pixi.sh).
+ **rattler-build** – The package build tool used by Deadline Cloud conda recipes. After you install pixi, run the following command to install `rattler-build`.

  ```
  pixi global install rattler-build
  ```
+ **git** – Required to clone the samples repository. On Windows, [git for Windows](https://gitforwindows.org/) also provides a `bash` shell, which some of the Windows sample recipes require.

## Building and publishing a package to a local channel
<a name="build-test-locally-build"></a>

In this procedure, you clone the Deadline Cloud samples repository and use `rattler-build publish` to build and publish the package to a local filesystem channel.

**Note**  
Large applications can require tens of GB of free disk space for the source archive, extracted files, and build output. Make sure that you use a disk with enough available space for the package build output.

**To build and publish a package to a local channel**

1. Clone the Deadline Cloud samples repository.

   ```
   git clone https://github.com/aws-deadline/deadline-cloud-samples.git
   ```

1. Change to the `conda_recipes` directory.

   ```
   cd deadline-cloud-samples/conda_recipes
   ```

1. Run the following command to build the Blender 4.5 recipe and publish the package to a local channel directory.

   On Linux and macOS, run the following command.

   ```
   rattler-build publish blender-4.5/recipe/recipe.yaml \
       --to file://$HOME/my-conda-channel \
       --build-number=+1
   ```

   On Windows (cmd), run the following command.

   ```
   rattler-build publish blender-4.5/recipe/recipe.yaml ^
       --to file://%USERPROFILE%/my-conda-channel ^
       --build-number=+1
   ```

   The `rattler-build publish` command performs the following actions:
   + Builds the package from the recipe.
   + Creates the channel directory if the directory does not exist.
   + Copies the package file to the channel.
   + Indexes the channel so that package managers can find the package.

   If your package recipe depends on packages from a particular channel, such as [conda-forge](https://conda-forge.org/), add `-c conda-forge` to the command.

**About build numbers**  
The `--build-number=+1` option automatically picks the next build number based on what already exists in the destination channel. The best practice is to never overwrite a package in a channel. Always build to a new build number if the package would otherwise have the same filename. Using `--build-number=+1` achieves this when you build to a production channel or a staging channel that mirrors production.  
If you want to control the build number directly, you can set it with a specific value such as `--build-number=7`. If you omit the option, `rattler-build` uses the build number defined in the `recipe.yaml` file.

For more information about `rattler-build publish`, see the [rattler-build publish documentation](https://rattler-build.prefix.dev/latest/publish/).

## Debugging builds
<a name="build-test-locally-debug"></a>

If a build fails, `rattler-build` preserves the build directory so you can investigate. Run the following command to open an interactive shell in the build environment with all environment variables set up as they were during the build.

```
rattler-build debug shell
```

From the debug shell, you can modify files, run individual build commands, and add dependencies to isolate the issue. For more information, see [Debugging builds](https://rattler-build.prefix.dev/latest/debugging_builds/) in the rattler-build documentation.

## Testing the package
<a name="build-test-locally-test"></a>

After you build and publish the package, create a temporary pixi project. Use the project to install the package from the local channel and verify that it works correctly.

**To test the package**

1. Create a temporary test directory and initialize a pixi project with the local channel.

   On Linux and macOS, run the following commands.

   ```
   mkdir package-test-env
   cd package-test-env
   pixi init --channel file://$HOME/my-conda-channel
   ```

   On Windows (cmd), run the following commands.

   ```
   mkdir package-test-env
   cd package-test-env
   pixi init --channel file://%USERPROFILE%/my-conda-channel
   ```

1. Add the package to the project.

   ```
   pixi add blender=4.5
   ```

1. Verify that the package works correctly.

   ```
   pixi run blender --version
   ```

   The [`pixi run`](https://pixi.sh/latest/reference/cli/pixi/run/) command activates the conda environment for the project directory and runs the specified command within it. The environment persists in the project directory, so you can use the same `pixi run` command from other terminals.

When you are satisfied with the package, you can publish the package to an Amazon S3 conda channel so that Deadline Cloud workers can install the package. See [Publish packages to an Amazon S3 conda channel](publish-packages-s3-channel.md).

## Removing packages from the channel
<a name="build-test-locally-remove-packages"></a>

Avoid removing packages from channels that you use for production, because lockfiles reference specific packages by hash. Removing a package prevents re-creating environments from those lockfiles. For development and testing channels, you can remove a specific package by deleting the `.conda` file from the channel directory and then re-indexing the channel. First, install `rattler-index`.

```
pixi global install rattler-index
```

Then delete the package file and re-index the channel.

On Linux and macOS, run the following commands.

```
rm $HOME/my-conda-channel/linux-64/blender-4.5.0-hb0f4dca_1.conda
rattler-index fs $HOME/my-conda-channel
```

On Windows (cmd), run the following commands.

```
del %USERPROFILE%\my-conda-channel\win-64\blender-4.5.0-hb0f4dca_1.conda
rattler-index fs %USERPROFILE%\my-conda-channel
```

Package files are stored in platform-specific subdirectories such as `linux-64`, `win-64`, or `osx-arm64`. List the contents of these subdirectories to find the exact filename of the package you want to remove.
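
For example, the following sketch lists any package files present in the common platform subdirectories. The channel path is the example path used in this section; adjust it to match your channel.

```
# List package files in each platform subdirectory of the example channel
CHANNEL="$HOME/my-conda-channel"
for platform in linux-64 win-64 osx-arm64 noarch; do
    if [ -d "$CHANNEL/$platform" ]; then
        echo "-- $platform --"
        ls "$CHANNEL/$platform"
    fi
done
```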

## Cleaning up
<a name="build-test-locally-cleanup"></a>

After testing, you can remove the test project and the local channel.

**To clean up test resources**

1. Remove the test project directory.

   On Linux and macOS, run the following command.

   ```
   rm -rf package-test-env
   ```

   On Windows (cmd), run the following command.

   ```
   rmdir /s /q package-test-env
   ```

1. Remove the local conda channel directory.

   On Linux and macOS, run the following command.

   ```
   rm -rf $HOME/my-conda-channel
   ```

   On Windows (cmd), run the following command.

   ```
   rmdir /s /q %USERPROFILE%\my-conda-channel
   ```

1. (Optional) Remove the `rattler-build` output directory that contains the built package file.

   On Linux and macOS, run the following command.

   ```
   rm -rf deadline-cloud-samples/conda_recipes/output
   ```

   On Windows (cmd), run the following command.

   ```
   rmdir /s /q deadline-cloud-samples\conda_recipes\output
   ```

# Publish packages to an Amazon S3 conda channel
<a name="publish-packages-s3-channel"></a>

You can publish conda packages to an Amazon Simple Storage Service (Amazon S3) bucket so that AWS Deadline Cloud (Deadline Cloud) workers can install them for running jobs. The `rattler-build publish` command works with Amazon S3 the same way as with a local filesystem channel. The command can build a recipe and publish the result, or publish a package file that you already built. In both cases, the command uploads the package to the bucket and indexes the channel in one step.

The `rattler-build publish` command authenticates with AWS using the standard credential chain, so it uses your AWS configuration like any AWS tool. For more information about configuring credentials, see [Configuration and credential file settings](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html) in the *AWS Command Line Interface (AWS CLI) User Guide*.
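
For example, you can keep a dedicated named profile for publishing in your AWS config file. The profile name `package-publisher` here is an example, not a required name:

```
[profile package-publisher]
region = us-west-2
```

With this profile configured, setting `AWS_PROFILE=package-publisher` in your shell directs tools that use the standard credential chain, including `rattler-build publish` and the AWS CLI, to use it.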

## Prerequisites
<a name="publish-s3-prereqs"></a>

Before you publish packages to Amazon S3, complete the following prerequisites:
+ **pixi and rattler-build** – Install pixi from [pixi.sh](https://pixi.sh), then install `rattler-build`.

  ```
  pixi global install rattler-build
  ```
+ **git** – Required to clone the samples repository. On Windows, [git for Windows](https://gitforwindows.org/) also provides a `bash` shell, which some of the Windows sample recipes require.
+ **Amazon S3 bucket** – An Amazon S3 bucket to use as the conda channel. You can use the job attachments bucket from your Deadline Cloud farm or create a separate bucket.
+ **AWS credentials** – Configure credentials on your workstation using the `aws configure` command or the `aws login` command. For more information, see [Setting up the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-quickstart.html) in the *AWS Command Line Interface User Guide*.
+ **IAM permissions** – (Optional) To reduce the scope of permissions your credentials have, you can use an AWS Identity and Access Management (IAM) policy that only grants the following permissions on the Amazon S3 bucket and the channel prefix you use (for example, `/Conda/*`):
  + `s3:GetObject`
  + `s3:PutObject`
  + `s3:DeleteObject`
  + `s3:ListBucket`
  + `s3:GetBucketLocation`
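
A policy granting those permissions might look like the following sketch. Replace *amzn-s3-demo-bucket* with your bucket name; the `Conda` prefix here matches the channel prefix used in the examples in this topic.

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CondaChannelObjects",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::amzn-s3-demo-bucket/Conda/*"
        },
        {
            "Sid": "CondaChannelBucket",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::amzn-s3-demo-bucket"
        }
    ]
}
```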

## Publishing a package to an Amazon S3 channel
<a name="publish-s3-procedure"></a>

Use `rattler-build publish` with an `s3://` target to publish a package to your Amazon S3 conda channel. If the channel does not exist in the bucket, `rattler-build` initializes the channel automatically. Before you begin, make sure that you have completed the [prerequisites](#publish-s3-prereqs).

The following example publishes the Blender 4.5 sample recipe from the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples) repository on GitHub. You can substitute a different recipe from the samples repository or use your own recipe.

**Note**  
Large applications can require tens of GB of free disk space for the source archive, extracted files, and build output. Make sure that you use a disk with enough available space for the package build output.

**To publish a package to an Amazon S3 channel**

1. Clone the Deadline Cloud samples repository.

   ```
   git clone https://github.com/aws-deadline/deadline-cloud-samples.git
   ```

1. Change to the `conda_recipes` directory.

   ```
   cd deadline-cloud-samples/conda_recipes
   ```

1. Run the following command. Replace *amzn-s3-demo-bucket* with your bucket name.

   ```
   rattler-build publish blender-4.5/recipe/recipe.yaml --to s3://amzn-s3-demo-bucket/Conda/Default --build-number=+1
   ```

   The `/Conda/Default` prefix organizes the channel within the bucket. You can use a different prefix, but the prefix must be consistent across all commands and queue configurations that reference the channel.

**About build numbers**  
The `--build-number=+1` option automatically picks the next build number based on what already exists in the destination channel. The best practice is to never overwrite a package in a channel. Always build to a new build number if the package would otherwise have the same filename. Using `--build-number=+1` achieves this when you build to a production channel or a staging channel that mirrors production.  
If you want to control the build number directly, you can set it with a specific value such as `--build-number=7`. If you omit the option, `rattler-build` uses the build number defined in the `recipe.yaml` file.

If your package recipe depends on packages from a particular channel, such as [conda-forge](https://conda-forge.org/), add `-c conda-forge` to the command.

You can also publish a package file that you already built, for example, a `.conda` file from a local build. Replace *amzn-s3-demo-bucket* with your bucket name.

```
rattler-build publish output/linux-64/blender-4.5.0-hb0f4dca_0.conda \
    --to s3://amzn-s3-demo-bucket/Conda/Default
```

## Testing the package
<a name="publish-s3-test"></a>

After you publish the package, create a temporary pixi project to verify that the package works correctly. The project installs the package from the Amazon S3 channel.

**To test the package**

1. Create a temporary test directory and initialize a pixi project with the Amazon S3 channel. Replace *amzn-s3-demo-bucket* with your bucket name.

   ```
   mkdir package-test-env
   cd package-test-env
   pixi init --channel s3://amzn-s3-demo-bucket/Conda/Default
   ```

1. Add the package to the project.

   ```
   pixi add blender=4.5
   ```

1. Verify that the package works correctly.

   ```
   pixi run blender --version
   ```

   The [`pixi run`](https://pixi.sh/latest/reference/cli/pixi/run/) command activates the conda environment for the project directory and runs the specified command within it. The environment persists in the project directory, so you can use the same `pixi run` command from other terminals.

## Removing packages from the channel
<a name="publish-s3-remove-packages"></a>

Avoid removing packages from channels that you use for production, because lockfiles reference specific packages by hash. Removing a package prevents re-creating environments from those lockfiles. For development and testing channels, you can remove a specific package by deleting the `.conda` file from the bucket and then re-indexing the channel. First, install `rattler-index`.

```
pixi global install rattler-index
```

Then delete the package file and re-index the channel. Replace *amzn-s3-demo-bucket* with your bucket name.

```
aws s3 rm s3://amzn-s3-demo-bucket/Conda/Default/linux-64/blender-4.5.0-hb0f4dca_1.conda
rattler-index s3 s3://amzn-s3-demo-bucket/Conda/Default
```

Package files are stored in platform-specific subdirectories such as `linux-64`, `win-64`, or `osx-arm64`. To list the packages in a subdirectory, run the following command.

```
aws s3 ls s3://amzn-s3-demo-bucket/Conda/Default/linux-64/
```

## Cleaning up
<a name="publish-s3-cleanup"></a>

After testing, remove the test project directory.

**To clean up test resources**
+ Remove the test project directory.

  On Linux and macOS, run the following command.

  ```
  rm -rf package-test-env
  ```

  On Windows (cmd), run the following command.

  ```
  rmdir /s /q package-test-env
  ```

## Debugging builds
<a name="publish-s3-debug"></a>

If a build fails, `rattler-build` preserves the build directory so you can investigate. Run the following command to open an interactive shell in the build environment with all environment variables set up as they were during the build.

```
rattler-build debug shell
```

From the debug shell, you can modify files, run individual build commands, and add dependencies to isolate the issue. For more information, see [Debugging builds](https://rattler-build.prefix.dev/latest/debugging_builds/) in the rattler-build documentation.

## Building packages for other platforms
<a name="publish-s3-cross-platform"></a>

The `rattler-build publish` command builds packages for the operating system of the workstation where the command runs. If your Deadline Cloud fleet uses a different operating system than your workstation, or if your package has other host requirements, you have the following options:
+ Run `rattler-build publish` on a host that matches the target operating system. For example, use an Amazon Elastic Compute Cloud (Amazon EC2) instance running Linux to build packages for a Linux fleet.
+ Use a Deadline Cloud package building queue to automate builds on the target platform. See [Create a package building queue](automate-package-builds.md#s3-channel-create-queue).
+ (Advanced) Use cross-compilation to build packages for a different platform from your workstation. For more information, see [Cross-compilation](https://rattler-build.prefix.dev/latest/compilers/#cross-compilation) in the rattler-build documentation.

## Next steps
<a name="publish-s3-next-steps"></a>

After you publish packages to your Amazon S3 conda channel, configure your Deadline Cloud queues to use the channel:
+ [Configure production queue permissions for custom conda packages](configure-jobs-s3-channel.md#s3-channel-configure-permissions) – Grant your production queues read-only access to the Amazon S3 conda channel.
+ [Add a conda channel to a queue environment](configure-jobs-s3-channel.md#s3-channel-add-channel) – Configure the queue environment to install packages from the Amazon S3 conda channel.

## Configure production queue permissions for custom conda packages
<a name="s3-channel-configure-permissions"></a>

Your production queue needs read-only permissions to the `/Conda` prefix in the S3 bucket that hosts the channel. To grant those permissions, modify the policy of the AWS Identity and Access Management (IAM) role associated with the production queue:

1. Open the Deadline Cloud console and navigate to the queue details page for the production queue.

1. On the queue details page, choose **Edit queue**.

1. Scroll to the **Queue service role** section, then choose **View this role in the IAM console**.

1. From the list of permission policies, choose the **AmazonDeadlineCloudQueuePolicy** for your queue.

1. From the **Permissions** tab, choose **Edit**.

1. Add a statement like the following to the policy. Replace *amzn-s3-demo-bucket* and *111122223333* with your own bucket name and account ID.

   ```
   {
       "Sid": "CustomCondaChannelReadOnly",
       "Effect": "Allow",
       "Action": [
           "s3:GetObject",
           "s3:ListBucket"
       ],
       "Resource": [
           "arn:aws:s3:::amzn-s3-demo-bucket",
           "arn:aws:s3:::amzn-s3-demo-bucket/Conda/*"
       ],
       "Condition": {
           "StringEquals": {
               "aws:ResourceAccount": "111122223333"
           }
       }
   }
   ```

## Add a conda channel to a queue environment
<a name="s3-channel-add-channel"></a>

To use the S3 conda channel, add the `s3://amzn-s3-demo-bucket/Conda/Default` channel location to the `CondaChannels` parameter of jobs that you submit to Deadline Cloud. The submitters provided with Deadline Cloud include fields for specifying custom conda channels and packages.

You can avoid modifying every job by editing the conda queue environment for your production queue. Use the following procedure:

1. Open the Deadline Cloud console and navigate to the queue details page for the production queue.

1. Choose the **Environments** tab.

1. Select the **Conda** queue environment, and then choose **Edit**.

1. Choose the **JSON editor**, and then in the script, find the parameter definition for `CondaChannels`.

1. Edit the line `default: "deadline-cloud"` so that it starts with the newly created S3 conda channel:

   ```
   default: "s3://amzn-s3-demo-bucket/Conda/Default deadline-cloud"
   ```

Service-managed fleets enable flexible channel priority for conda by default. For example, for a job requesting `blender=4.5`, if Blender 4.5 is available in both the new channel and the `deadline-cloud` channel, the package is pulled from whichever channel is first in the channel list. If the requested package version is not found in the first channel, subsequent channels are checked in order.

For customer-managed fleets, you can enable the use of conda packages by using one of the [conda queue environment samples](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/queue_environments/README.md) in the Deadline Cloud samples GitHub repository.

# Create a conda package for an application or plugin
<a name="conda-package"></a>

A conda package is a compressed archive of software written in any language. Conda supports a variety of operating system and architecture combinations, so you can package full applications like Blender, Maya, and Nuke alongside libraries for Python and other languages. For more information about conda packages, see [Packages](https://docs.conda.io/projects/conda/en/latest/user-guide/concepts/packages.html) in the conda documentation.

To use a conda package, you install it into a virtual environment. A conda virtual environment has a *prefix directory* where packages are installed. Installing a package uses hardlinking or reflinking of files when supported, so creating multiple environments with the same packages does not use significant additional disk space. To use a virtual environment, you activate it to set environment variables. Activation runs scripts that packages provide, giving each package the opportunity to modify PATH or other environment variables. Conda packages typically contain applications or libraries, but the flexible activation means they can also point to applications installed on a shared filesystem.

Making a custom package involves three stages: a *recipe* contains the build instructions, a *package* is the built artifact (`.conda` or `.tar.bz2` file), and a *channel* hosts packages for installation. The `rattler-build publish` command handles all three steps—it can build a recipe into a package and publish to a channel, or it can take a package artifact directly to publish it.
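
For orientation, a minimal `recipe.yaml` has the following general shape. The package name, source URL, checksum, and dependencies here are placeholders, not a working recipe:

```
package:
  name: myapp
  version: "1.0.0"

source:
  url: https://example.com/downloads/myapp-1.0.0-linux-x64.tar.xz
  sha256: "0000000000000000000000000000000000000000000000000000000000000000"

build:
  number: 0
  script: build.sh

requirements:
  run:
    - python >=3.11

about:
  summary: Example application repackaged for conda
```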

The [conda-forge](https://conda-forge.org/) community maintains package recipes for a broad set of open source software, and hosts package artifacts in the `conda-forge` channel. You can configure your queue to include `conda-forge` as a package source, and then build custom packages that depend on conda-forge packages to run. For Linux, conda-forge hosts a full compiler toolchain including CUDA support, with consistent compiling and linking options selected. You can use conda-forge packages as dependencies in your own recipes, or install them alongside your custom packages in the same environment.

You can combine an entire application, including dependencies, into a conda package. The packages Deadline Cloud provides in the [`deadline-cloud` channel](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/create-queue-environment.html#conda-queue-environment) for service-managed fleets use this binary repackaging approach, which rearranges the files of a standard installation to fit the conda virtual environment layout.

**Note**  
Large applications can require tens of GB of free disk space for the source archive, extracted files, and build output. Make sure that you use a disk with enough available space for the package build output.

## Package an application
<a name="conda-package-application"></a>

When repackaging an application for conda, there are two goals:
+ Most files for the application should be separate from the primary conda virtual environment structure. Environments can then mix the application with packages from other sources like [conda-forge](https://conda-forge.org/).
+ When a conda virtual environment is activated, the application should be available from the PATH environment variable.

**To repackage an application for conda**

1. Write conda build recipes that install the application into a subdirectory like `$CONDA_PREFIX/opt/<application-name>`. This separates it from the standard prefix directories like `bin` and `lib`.

1. Add symlinks or launch scripts to `$CONDA_PREFIX/bin` to run the application binaries.

   Alternatively, create activate.d scripts that the `conda activate` command runs to add the application binary directories to the PATH. On Windows, where symlinks are not supported in every location where environments can be created, use launch scripts or activate.d scripts instead.

1. Provide, within the package you create, any dependencies that are not installed by default on Deadline Cloud service-managed fleets. For example, X11 window system libraries are usually unnecessary for non-interactive jobs, but some applications require them to run even without a graphical interface.

1. If the application supports plugins, provide a clear convention that plugin packages should follow to integrate with the application in a virtual environment. For example, the [Maya 2026 sample recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/maya-2026#instructions-for-maya-plugin-packages) documents this convention for Maya plugins.

1. Ensure you follow the copyright and license agreements for the applications you package. We recommend using a private Amazon S3 bucket for your conda channel to control distribution and limit package access to your farm.
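
As an illustration of the first two steps, an activate.d script for a hypothetical application installed to `$CONDA_PREFIX/opt/myapp` could look like the following on Linux. The `MYAPP_LOCATION` variable and the `myapp` paths are examples, not a convention that Deadline Cloud requires.

```
# Hypothetical script at $CONDA_PREFIX/etc/conda/activate.d/myapp.sh
# Runs on environment activation; conda and pixi set CONDA_PREFIX first.
export MYAPP_LOCATION="$CONDA_PREFIX/opt/myapp"
export PATH="$MYAPP_LOCATION/bin:$PATH"
```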

Sample recipes for the packages in the `deadline-cloud` channel are available in the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes#readme) repository on GitHub.

## Package a plugin
<a name="conda-package-plugins"></a>

Application plugins can be packaged as their own conda packages. When creating a plugin package, follow these guidelines:
+ Include the host application package as both a build and a run dependency in the build recipe `recipe.yaml`. Use a version constraint so that the plugin package is only installed alongside compatible versions of the host application.
+ Follow the host application package conventions for registering the plugin.

## Adaptor packages
<a name="conda-package-adaptors"></a>

Some Deadline Cloud application integrations use an *adaptor* that extends the application interface to simplify [writing job templates](building-jobs.md). An adaptor is a command-line interface with support for running a background daemon, reporting status, and applying path mapping. For more information, see the [Open Job Description Adaptor Runtime](https://github.com/OpenJobDescription/openjd-adaptor-runtime-for-python#readme) on GitHub. For example, [deadline-cloud-for-maya](https://github.com/aws-deadline/deadline-cloud-for-maya/) on GitHub includes an integrated job submission GUI and a Maya adaptor that is available as the `maya-openjd` package on service-managed fleets.

Job submissions from Deadline Cloud submitter GUIs include a `CondaPackages` parameter value that specifies the conda packages to include in a virtual environment for running the job. The `CondaPackages` parameter value for Maya typically looks like `maya=2026.* maya-openjd=0.15.* maya-mtoa` and might contain alternative entries for plugin packages. When the queue environment sets up a conda virtual environment for running the job, it resolves these package names and version constraints to be compatible and adds all the dependency packages they need to run. Each adaptor and plugin package specifies what it is compatible with, including which versions of Maya, which versions of Python, and other dependencies.
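
As a sketch, a job bundle's parameter values file might set these parameters as follows. The exact values and file layout depend on your submitter, packages, and channel; the bucket name is a placeholder.

```
parameterValues:
  - name: CondaPackages
    value: "maya=2026.* maya-openjd=0.15.*"
  - name: CondaChannels
    value: "s3://amzn-s3-demo-bucket/Conda/Default deadline-cloud"
```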

To build your own adaptor packages using our samples such as the [maya-openjd recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/maya-openjd) on GitHub, you can build on the packages for Python and other dependencies provided by [conda-forge](https://conda-forge.org/). You might need to build the [deadline](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/deadline) and [openjd-adaptor-runtime](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/openjd-adaptor-runtime) recipes first to satisfy dependencies.

# Create a conda build recipe for Blender
<a name="create-conda-recipe-blender"></a>

Blender is free to use and simple to package with conda, which makes it a good starting point for learning how to create conda packages for AWS Deadline Cloud (Deadline Cloud). The Blender Foundation provides [application archives](https://download.blender.org/release/Blender4.5/) for multiple operating systems. The [Blender 4.5 sample recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/blender-4.5) in the Deadline Cloud samples repository on GitHub packages these archives into a conda package.

## Understanding the recipe
<a name="blender-recipe-structure"></a>

The [recipe.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/blender-4.5/recipe/recipe.yaml) file defines the package metadata, source URLs, and build options in [rattler-build template syntax](https://rattler-build.prefix.dev/latest/reference/recipe_file/#spec-reference). The recipe specifies the version number once and provides different source URLs based on the operating system.

The `build` section in `recipe.yaml` turns off binary relocation and dynamic shared object (DSO) linking checks. These options control how the package works when installed into a conda virtual environment at any directory prefix. The defaults for these options are designed for packaging each dependency library separately; when you repackage a prebuilt application binary, you need to change them. Blender does not require any RPATH adjustment because the application archives are built with relocatability in mind. See [Create a conda build recipe for Autodesk Maya](create-conda-recipe-maya.md) for an example of adding relocatability.
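
In rattler-build recipe syntax, these options look like the following sketch. The option names follow the rattler-build recipe reference; check the sample `recipe.yaml` for the exact settings it uses.

```
build:
  dynamic_linking:
    # Don't rewrite prefixes in binaries; the archive is already relocatable.
    binary_relocation: false
    # Don't fail the build when a shared library resolves outside the environment.
    missing_dso_allowlist:
      - "*"
```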

During the package build, the [build.sh](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/blender-4.5/recipe/build.sh) or [build_win.sh](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/blender-4.5/recipe/build_win.sh) script runs to install files into the environment. These scripts copy the installation files into `$PREFIX/opt/blender`, create symlinks in `$PREFIX/bin` (on Linux), and set up activation scripts that configure environment variables such as `BLENDER_LOCATION`. On Windows, the activation script adds the Blender directory to the `PATH` instead of creating symlinks.
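
On Linux, an activation script for a package like this typically exports the application's location and prepends it to the `PATH`. The following is a minimal sketch, not the exact script from the sample recipe; `CONDA_PREFIX` is normally set by conda at activation time and is defaulted here only so the sketch runs standalone.

```
# Sketch of a conda activation script for a repackaged application.
# CONDA_PREFIX is set by conda when the environment activates;
# the fallback value is only for running this sketch outside conda.
export BLENDER_LOCATION="${CONDA_PREFIX:-/tmp/demo-env}/opt/blender"

# Expose the application's executables for the duration of the environment.
export PATH="$BLENDER_LOCATION:$PATH"
```

A matching deactivation script would unset `BLENDER_LOCATION` and remove the entry from `PATH` so the environment leaves no residue after it deactivates.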

The Windows build script uses `bash` instead of a `cmd.exe` .bat file for consistency across platforms. You can install [git for Windows](https://gitforwindows.org/) to provide `bash` for package building.

The recipe also includes a `deadline-cloud.yaml` file that specifies the conda platforms and metadata for submitting automated package build jobs to Deadline Cloud. For more information, see [Submit a package build job](automate-package-builds.md#automate-submit-package-job).

## Building the Blender package
<a name="s3-channel-build-blender"></a>

Use `rattler-build publish` to build the Blender 4.5 recipe and publish the package to a channel. You can publish to a local filesystem channel for testing or directly to an Amazon S3 channel for production use. If you completed the setup in [Build and test packages locally](build-test-packages-locally.md), run the following command from the `conda_recipes` directory.

```
rattler-build publish blender-4.5/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1
```

For other publishing options:
+ To publish to an Amazon S3 channel, see [Publish packages to an S3 conda channel](publish-packages-s3-channel.md).
+ To automate builds using a Deadline Cloud package building queue, see [Automate package builds with Deadline Cloud](automate-package-builds.md).

# Test your package with a Blender render job
<a name="s3-channel-submit-job"></a>

After you build the Blender 4.5 package, you can test it with a render job. If you do not have a Blender scene, download the Blender 3.5 - Cozy Kitchen scene from the [Blender demo files](https://www.blender.org/download/demo-files) page. The Deadline Cloud samples repository contains a `blender_render` job bundle and a conda queue environment that you can use for both local and cloud testing.

## Testing locally
<a name="blender-test-locally"></a>

You can run the job template on your workstation using the [Open Job Description CLI](https://github.com/OpenJobDescription/openjd-cli#readme). Install the CLI with `pip`.

```
pip install openjd-cli
```

From the `job_bundles` directory in the samples repository, run the following command. Replace */path/to/scene.blend* with the path to your Blender scene file.

```
openjd run blender_render/template.yaml \
    --environment ../queue_environments/conda_queue_env_pyrattler.yaml \
    -p CondaPackages=blender=4.5 \
    -p CondaChannels=file://$HOME/my-conda-channel \
    -p BlenderSceneFile=/path/to/scene.blend \
    -p Frames=1
```

The `--environment` option applies the conda queue environment, which creates a conda virtual environment with the packages specified in `CondaPackages`. The `CondaChannels` parameter tells the queue environment where to find the packages. If you published to an Amazon S3 channel instead of a local channel, replace the `file://` path with your `s3://` channel URL.

## Testing on Deadline Cloud
<a name="blender-test-deadline-cloud"></a>

After you configure your production queue to use the Amazon S3 conda channel, you can submit the render job to Deadline Cloud. From the `job_bundles` directory in the samples repository, run the following command.

```
deadline bundle submit blender_render \
    -p CondaPackages=blender=4.5 \
    -p BlenderSceneFile=/path/to/scene.blend \
    -p Frames=1
```

Use the Deadline Cloud monitor to track the progress of the job. In the monitor, select the task for the job and choose **View logs**. Select the **Launch Conda** session action to verify that the package was found in the Amazon S3 channel.

# Create a conda build recipe for Autodesk Maya
<a name="create-conda-recipe-maya"></a>

Commercial applications like Autodesk Maya introduce additional packaging requirements compared to open source applications like Blender. The [Blender recipe](create-conda-recipe-blender.md) packages a simple relocatable archive under an open source license. Commercial applications are often distributed through installers and require license management configuration.

## Considerations for commercial applications
<a name="maya-commercial-considerations"></a>

The following considerations apply when packaging commercial applications. The details illustrate how each applies to Maya.
+ **Licensing** – Understand the licensing rights and restrictions of the application. You might need to configure a license management system. Read the [Autodesk Subscription Benefits FAQ about Cloud Rights](https://www.autodesk.com/support/technical/article/caas/sfdcarticles/sfdcarticles/Subscription-Benefits-FAQ-Cloud-Rights.html) to understand the cloud rights for Maya. Autodesk products rely on a `ProductInformation.pit` file that typically requires administrator access to configure. Product features for thin clients provide a relocatable alternative. See [Thin Client Licensing for Maya and MotionBuilder](https://www.autodesk.com/support/technical/article/caas/tsarticles/ts/2zqRBCuGDrcPZDzULJQ27p.html) for more information.
+ **System library dependencies** – Some applications depend on libraries not installed on service-managed fleet worker hosts. Maya depends on libraries including freetype and fontconfig. When these libraries are available in the system package manager, such as `dnf` for AL2023, you can use the package manager as a source. Because RPM packages are not built to be relocatable, you need to use tools such as `patchelf` to resolve dependencies within the Maya installation prefix.
+ **Administrator access for installation** – Some installers require administrator access. Service-managed fleets do not provide administrator access, so you need to install the application on a separate system and create an archive of the files for the package build. The Windows installer for Maya requires this approach. The [README.md](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-2026/README.md) in the recipe documents a repeatable procedure using a freshly launched Amazon Elastic Compute Cloud (Amazon EC2) instance.
+ **Plugin integration** – The sample Maya package defines `MAYA_NO_HOME=1` to isolate the application from user-level configuration, and adds module search paths to `MAYA_MODULE_PATH` so that plugin packages can place `.mod` files within the virtual environment. See the [Maya 2026 sample recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/maya-2026#instructions-for-maya-plugin-packages) for the full plugin integration convention.

## Understanding the recipe
<a name="maya-recipe-structure"></a>

The [recipe.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-2026/recipe/recipe.yaml) file defines the package metadata in [rattler-build template syntax](https://rattler-build.prefix.dev/latest/reference/recipe_file/#spec-reference). Review the following sections of the file:
+ **source** – References the installer archives, including the sha256 hash. On Linux, the source is the Autodesk installer archive. On Windows, the source includes both the installer archive and a `cleanMayaForCloud.py` script from Autodesk that prepares Maya for cloud deployment. Update the hashes when you change the source files, for example when packaging a new version.
+ **build** – Turns off the default binary relocation and DSO linking checks because the automatic mechanisms do not work correctly for the library and binary directories that Maya uses. On Linux, the recipe includes `patchelf` as a build dependency to manually set relative RPATHs.
+ **about** – Metadata about the application for browsing or processing the contents of a conda channel.

The build scripts ([build.sh](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-2026/recipe/build.sh) for Linux, [build_win.sh](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-2026/recipe/build_win.sh) for Windows) include comments explaining each step. The scripts perform the following key tasks:
+ **Extract the installer** – Extracts the Maya installation files into the conda prefix. The Linux and Windows scripts handle this differently due to the installer formats. See the build scripts for details.
+ **Install system library dependencies** – On Linux, the script downloads and extracts system libraries that Maya needs but that are not present on service-managed fleet hosts. The script copies these libraries into the Maya `lib` directory so they are available within the conda environment.
+ **Set relative RPATHs with patchelf** – On Linux, the script uses `patchelf --add-rpath` to add `$ORIGIN`-relative paths to the shared libraries. This approach follows the conda recommendation to never use `LD_LIBRARY_PATH` in conda environments. The script patches libraries at multiple directory levels (`lib`, `lib/python*/site-packages`, `lib/python*/lib-dynload`) so that each library can find its dependencies relative to its own location. The recipe follows the best practice of setting `DT_RUNPATH` instead of `DT_RPATH`, which allows `LD_LIBRARY_PATH` to override the search path when needed for debugging.
+ **Configure thin client licensing** – The script sets up [thin client licensing as documented by Autodesk](https://www.autodesk.com/support/technical/article/caas/tsarticles/ts/2zqRBCuGDrcPZDzULJQ27p.html) so that the `ProductInformation.pit` file can be located within the conda environment rather than requiring system-level administrator access.
+ **Set up activation scripts** – The scripts create activate and deactivate scripts that set environment variables including `MAYA_LOCATION`, `MAYA_VERSION`, `MAYA_NO_HOME`, and `MAYA_MODULE_PATH`. On Windows, the scripts produce both `.sh` and `.bat` activation files because the Deadline Cloud sample queue environments use `bash` to activate environments on Windows.
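
The `$ORIGIN` technique can be sketched as follows. This is an illustrative outline, not the sample recipe's script: the library directory layout is hypothetical, and the loop only runs when `patchelf` is installed.

```
# Illustrative sketch: add an $ORIGIN-relative RUNPATH to each bundled library
# so it resolves sibling libraries without LD_LIBRARY_PATH.
# The directory layout here is hypothetical.
LIBDIR="${PREFIX:-/tmp/demo-prefix}/opt/app/lib"
mkdir -p "$LIBDIR"

if command -v patchelf >/dev/null 2>&1; then
    for lib in "$LIBDIR"/*.so*; do
        [ -e "$lib" ] || continue
        # Single quotes keep $ORIGIN literal so the dynamic linker,
        # not the shell, expands it at load time.
        patchelf --add-rpath '$ORIGIN' "$lib"
    done
else
    echo "patchelf not found; skipping RPATH step"
fi
```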

## Building the Maya package
<a name="maya-build-package"></a>

Before you build the Maya package, download the Maya installer from your Autodesk account. For Linux, place the archive directly into the `conda_recipes/archive_files` directory. For Windows, follow the procedure in the [README.md](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-2026/README.md) to create the archive.

Use `rattler-build publish` to build and publish the package. The Maya recipe requires `patchelf` as a build dependency on Linux, which is available from [conda-forge](https://conda-forge.org/). Add `-c conda-forge` to make the dependency available during the build. From the `conda_recipes` directory, run the following command.

```
rattler-build publish maya-2026/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1 \
    -c conda-forge
```

For other publishing options:
+ To publish to an Amazon S3 channel, see [Publish packages to an S3 conda channel](publish-packages-s3-channel.md).
+ To automate builds using a Deadline Cloud package building queue, see [Automate package builds with Deadline Cloud](automate-package-builds.md). To build both Linux and Windows packages, use the `--all-platforms` option with the `submit-package-job` script.

To render the turntable sample with Maya and Arnold, build both the [MtoA plugin](create-conda-recipe-mtoa-plugin.md) and [Maya adaptor](create-conda-recipe-maya-openjd.md) packages. After you publish all three packages, you can submit a test render job using the [turntable with Maya/Arnold](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/turntable_with_maya_arnold) job bundle from the Deadline Cloud samples repository. See [Test your packages with a Maya render job](submit-render-maya-mtoa.md).

# Create a conda build recipe for the Maya adaptor
<a name="create-conda-recipe-maya-openjd"></a>

The `maya-openjd` package provides the adaptor that integrates Maya with AWS Deadline Cloud (Deadline Cloud) job submissions. When you submit a Maya render job using a Deadline Cloud submitter GUI, the `CondaPackages` parameter includes `maya-openjd` alongside the `maya` package. The adaptor handles launching Maya, communicating render parameters, and managing the application lifecycle during a job session. For more information about adaptors, see [Adaptor packages](conda-package.md#conda-package-adaptors).

## Understanding the recipe
<a name="maya-openjd-recipe-structure"></a>

The [maya-openjd sample recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/maya-openjd) builds the adaptor from the [deadline-cloud-for-maya](https://github.com/aws-deadline/deadline-cloud-for-maya) source package published to PyPI. The [recipe.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-openjd/recipe/recipe.yaml) installs the package using `pip` into the conda environment.

The recipe depends on Python and two other packages from the Deadline Cloud samples repository that you need to build first:
+ [deadline](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/deadline) – The Deadline Cloud client library.
+ [openjd-adaptor-runtime](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/openjd-adaptor-runtime) – The Open Job Description adaptor runtime.

Python and other dependencies are available from [conda-forge](https://conda-forge.org/), so add `-c conda-forge` to the `rattler-build publish` command when you build the adaptor package.

## Building the adaptor package
<a name="maya-openjd-build-package"></a>

The `maya-openjd` package depends on two other packages from the Deadline Cloud samples repository. Build all three packages in order from the `conda_recipes` directory. The `-c conda-forge` option on each command satisfies recipe dependencies on Python and Python libraries.

Build the `deadline` package.

```
rattler-build publish deadline/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1 \
    -c conda-forge
```

Build the `openjd-adaptor-runtime` package.

```
rattler-build publish openjd-adaptor-runtime/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1 \
    -c conda-forge
```

Build the `maya-openjd` package.

```
rattler-build publish maya-openjd/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1 \
    -c conda-forge
```

For other publishing options:
+ To publish to an Amazon S3 channel, see [Publish packages to an S3 conda channel](publish-packages-s3-channel.md).
+ To automate builds using a Deadline Cloud package building queue, see [Automate package builds with Deadline Cloud](automate-package-builds.md).

# Create a conda build recipe for Autodesk Maya to Arnold (MtoA) plugin
<a name="create-conda-recipe-mtoa-plugin"></a>

The Maya to Arnold (MtoA) plugin adds the Arnold renderer as an option within Maya. The [MtoA sample recipe](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes/maya-mtoa-2026) demonstrates how to package a plugin as a separate conda package that integrates with the host application package.

## Understanding the recipe
<a name="mtoa-recipe-structure"></a>

The [recipe.yaml](https://github.com/aws-deadline/deadline-cloud-samples/blob/mainline/conda_recipes/maya-mtoa-2026/recipe/recipe.yaml) specifies a dependency on the `maya` package for both build and run requirements. This dependency uses a version constraint so that the plugin is only installed with a compatible Maya version.
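
In rattler-build recipe syntax, such a run requirement might look like the following sketch. The constraint shown is illustrative; see the sample `recipe.yaml` for the exact pin it uses.

```
requirements:
  run:
    # Constrain the plugin to install only alongside a compatible Maya version.
    - maya 2026.*
```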

The recipe uses the same source archives as the Maya recipe. The build script installs MtoA and creates a `mtoa.mod` file in the `$PREFIX/usr/autodesk/maya$MAYA_VERSION/modules` directory that the Maya package configures in `MAYA_MODULE_PATH`. Arnold and Maya use the same licensing technology, so the Maya package already includes the licensing information that Arnold needs.
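
A Maya module file is a short text description that points Maya at a plugin's files. A minimal sketch follows; the version number and path are illustrative, not the values the sample recipe writes.

```
+ mtoa 5.5.0 /path/to/env/opt/mtoa
```

Because the Maya package adds the modules directory to `MAYA_MODULE_PATH`, Maya discovers the plugin automatically whenever both packages are installed in the same conda environment.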

## Building the MtoA package
<a name="mtoa-build-package"></a>

Build the Maya package before you build the MtoA package, because MtoA depends on Maya at build time. Use `rattler-build publish` to build and publish the package. From the `conda_recipes` directory, run the following command.

```
rattler-build publish maya-mtoa-2026/recipe/recipe.yaml \
    --to file://$HOME/my-conda-channel \
    --build-number=+1
```

The `rattler-build publish` command uses the target channel as the highest priority channel when resolving dependencies, so the `maya` package you published earlier is available automatically.

For other publishing options:
+ To publish to an Amazon S3 channel, see [Publish packages to an S3 conda channel](publish-packages-s3-channel.md).
+ To automate builds using a Deadline Cloud package building queue, see [Automate package builds with Deadline Cloud](automate-package-builds.md).

# Test your packages with a Maya render job
<a name="submit-render-maya-mtoa"></a>

After you build the Maya, MtoA, and `maya-openjd` packages, you can test them with a render job. The Deadline Cloud samples repository contains a [turntable with Maya/Arnold](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/job_bundles/turntable_with_maya_arnold) job bundle that renders an animation using Maya and Arnold. The job bundle also uses FFmpeg to encode a video, which is available from the `conda-forge` channel.

## Testing locally
<a name="maya-test-locally"></a>

You can run the job template on your workstation using the [Open Job Description CLI](https://github.com/OpenJobDescription/openjd-cli#readme). Install the CLI with `pip`.

```
pip install openjd-cli
```

From the `job_bundles` directory in the samples repository, run the following command. The `ErrorOnArnoldLicenseFail=false` parameter tells Arnold to render with watermarks instead of failing when no license is available.

```
openjd run turntable_with_maya_arnold/template.yaml \
    --environment ../queue_environments/conda_queue_env_pyrattler.yaml \
    -p CondaPackages="maya maya-mtoa maya-openjd ffmpeg" \
    -p CondaChannels="file://$HOME/my-conda-channel conda-forge" \
    -p ErrorOnArnoldLicenseFail=false \
    -p FrameRange=1-5
```

The `--environment` option applies the conda queue environment, which creates a conda virtual environment with the packages specified in `CondaPackages`. The `CondaChannels` parameter includes both the local channel for your custom packages and `conda-forge` for `ffmpeg`. If you published to an Amazon S3 channel instead of a local channel, replace the `file://` path with your `s3://` channel URL.

When the job completes, the rendered output is in the `turntable_with_maya_arnold/output/` directory.

## Testing on Deadline Cloud
<a name="maya-test-deadline-cloud"></a>

After you configure your production queue to use the Amazon S3 conda channel, submit the render job to Deadline Cloud. Add the `conda-forge` channel to the `CondaChannels` parameter in your conda queue environment to provide a source for `ffmpeg` and the Python dependencies that the adaptor requires. From the `job_bundles` directory in the samples repository, run the following command.

```
deadline bundle submit turntable_with_maya_arnold
```

Use the Deadline Cloud monitor to track the progress of the job. In the monitor, select the task for the job and choose **View logs**. Select the **Launch Conda** session action to verify that the `maya`, `maya-mtoa`, and `maya-openjd` packages were found in the Amazon S3 channel.

# Automate package builds with Deadline Cloud
<a name="automate-package-builds"></a>

For CI/CD workflows or when you need to build packages for multiple operating systems, you can create a Deadline Cloud package building queue. The queue schedules build jobs on your fleet, which build the packages and publish them to your Amazon Simple Storage Service (Amazon S3) conda channel. This simplifies maintaining continuous package builds for software releases across all your required configurations.

You can create a package building queue using an AWS CloudFormation (CloudFormation) template, or manually from the Deadline Cloud console. The CloudFormation template deploys a complete farm with a production queue and a package building queue already configured. Creating the queue from the console gives you more control over individual settings.

## Create a package building queue with CloudFormation
<a name="s3-channel-create-queue-cfn"></a>

You can use a CloudFormation template to create a Deadline Cloud farm that includes a package building queue. The template configures a production queue and a package building queue with a private Amazon S3 conda channel.

Before you deploy the template, create an Amazon S3 bucket to hold job attachments and your conda channel. You can create a bucket from the [Amazon S3 console](https://console.aws.amazon.com/s3/). You need the bucket name when you deploy the template.

**To deploy the CloudFormation template**

1. Download the [deadline-cloud-starter-farm-template.yaml](https://github.com/aws-deadline/deadline-cloud-samples/raw/mainline/cloudformation/farm_templates/starter_farm/deadline-cloud-starter-farm-template.yaml) template from the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/cloudformation/farm_templates/starter_farm) repository on GitHub.

1. From the [CloudFormation console](https://console.aws.amazon.com/cloudformation/), choose **Create Stack**, then **With new resources (standard)**.

1. Select the option to upload a template file, then upload the `deadline-cloud-starter-farm-template.yaml` file.

1. Enter a name for the stack, such as **StarterFarm**, and provide the name of an Amazon S3 bucket for job attachments and the conda channel.

1. Follow the CloudFormation console steps to complete stack creation.

For more information about the template parameters and customization options, see the [starter farm README](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/cloudformation/farm_templates/starter_farm) in the Deadline Cloud samples repository on GitHub.

## Create a package building queue from the console
<a name="s3-channel-create-queue"></a>

Follow the instructions in [Create a queue](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/create-queue.html) in the *Deadline Cloud User Guide*. Make the following changes:
+ In step 5, choose an existing Amazon S3 bucket. Specify a root folder name such as **DeadlineCloudPackageBuild** so that build artifacts stay separate from your normal Deadline Cloud attachments.
+ In step 6, you can associate the package building queue with an existing fleet, or you can create an entirely new fleet if your current fleet is unsuitable.
+ In step 9, create a new service role for your package building queue. You will modify this role to grant the permissions required for uploading packages and reindexing a conda channel.

### Configure the package building queue permissions
<a name="package-building-queue-permissions"></a>

To allow the package building queue to access the `/Conda` prefix in the queue's Amazon S3 bucket, you must modify the queue's role to give it read/write access. The role needs the following permissions so that package build jobs can upload new packages and reindex the channel.
+ `s3:GetObject`
+ `s3:PutObject`
+ `s3:ListBucket`
+ `s3:GetBucketLocation`
+ `s3:DeleteObject`

1. Open the Deadline Cloud console and navigate to the queue details page for the package build queue.

1. Choose **Edit queue**.

1. Scroll to the **Queue service role** section, then choose **View this role in the IAM console**.

1. From the list of permission policies, choose the **AmazonDeadlineCloudQueuePolicy** for your queue.

1. From the **Permissions** tab, choose **Edit**.

1. Add a new section to the queue service role like the following. Replace *amzn-s3-demo-bucket* and *111122223333* with your own bucket and account.

   ```
   {
      "Effect": "Allow",
      "Sid": "CustomCondaChannelReadWrite",
      "Action": [
         "s3:GetObject",
         "s3:PutObject",
         "s3:DeleteObject",
         "s3:ListBucket",
         "s3:GetBucketLocation"
      ],
      "Resource": [
         "arn:aws:s3:::amzn-s3-demo-bucket",
         "arn:aws:s3:::amzn-s3-demo-bucket/Conda/*"
      ],
      "Condition": {
         "StringEquals": {
            "aws:ResourceAccount": "111122223333"
         }
      }
   },
   ```

## Submit a package build job
<a name="automate-submit-package-job"></a>

After you create a package building queue and configure the queue permissions, you can submit jobs to build conda packages. The `submit-package-job` script in the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples/tree/mainline/conda_recipes#readme) repository on GitHub submits a build job for a conda recipe.

You need the following:
+ The [Deadline Cloud CLI](https://github.com/aws-deadline/deadline-cloud) installed on your workstation.
+ An active [AWS Deadline Cloud monitor (Deadline Cloud monitor)](https://docs.aws.amazon.com/deadline-cloud/latest/userguide/working-with-deadline-monitor.html) login session.
+ A clone of the [Deadline Cloud samples](https://github.com/aws-deadline/deadline-cloud-samples) repository.

**To submit a package build job**

1. Open the Deadline Cloud configuration GUI and set the default farm and queue to your package building queue.

   ```
   deadline config gui
   ```

1. Change to the `conda_recipes` directory in the samples repository.

   ```
   cd deadline-cloud-samples/conda_recipes
   ```

1. Run the `submit-package-job` script with the recipe directory. The following example builds the Blender 4.5 recipe.

   ```
   ./submit-package-job blender-4.5/
   ```

   If the recipe requires a source archive that you have not yet downloaded, the script provides download instructions. Download the archive and run the script again.

After you submit the job, use the Deadline Cloud monitor to view the progress and status of the job.

![\[The Deadline Cloud monitor showing the progress and status of a job building a conda package.\]](http://docs.aws.amazon.com/deadline-cloud/latest/developerguide/images/Conda-Figure3.png)


The monitor shows the two steps of the job: building the package and then reindexing the conda channel. When you right-click on the task for the package building step and choose **View logs**, the monitor shows the session actions:
+ **Sync attachments** – Copies the input job attachments or mounts a virtual file system.
+ **Launch Conda** – The queue environment action. The build job doesn't specify conda packages, so this action finishes quickly.
+ **Launch CondaBuild Env** – Creates a conda virtual environment with the software needed to build a conda package and reindex a channel.
+ **Task run** – Builds the package and uploads the results to Amazon S3.

As the actions run, they send logs to Amazon CloudWatch (CloudWatch). When a job is complete, select **View logs for all tasks** to see additional logs about the setup and teardown of the environment.