From 0fe09cd6c8f3f5d9a5dd357088ce94a12280810e Mon Sep 17 00:00:00 2001
From: sekyondaMeta <127536312+sekyondaMeta@users.noreply.github.com>
Date: Thu, 13 Nov 2025 12:00:12 -0500
Subject: [PATCH 1/3] Update index.rst

---
 index.rst | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/index.rst b/index.rst
index 8da6d99092..f3b9fa2449 100644
--- a/index.rst
+++ b/index.rst
@@ -688,6 +688,14 @@ Welcome to PyTorch Tutorials
    :link: intermediate/monarch_distributed_tutorial.html
    :tags: Parallel-and-Distributed-Training
 
+
+.. customcarditem::
+   :header: Interactive Distributed Applications with Monarch
+   :card_description: Learn how to use Monarch's actor framework with TorchTitan to simplify large-scale distributed training across SLURM clusters.
+   :image: _static/img/thumbnails/cropped/generic-pytorch-logo.png
+   :link: intermediate/monarch_distributed_tutorial.html
+   :tags: Parallel-and-Distributed-Training
+
 .. Edge
 
 .. customcarditem::

From b023793a1c68fb6a38615234232c4b982962e588 Mon Sep 17 00:00:00 2001
From: sekyondaMeta <127536312+sekyondaMeta@users.noreply.github.com>
Date: Thu, 13 Nov 2025 12:33:31 -0500
Subject: [PATCH 2/3] Update index.rst

---
 index.rst | 1 +
 1 file changed, 1 insertion(+)

diff --git a/index.rst b/index.rst
index f3b9fa2449..5a5e80abfb 100644
--- a/index.rst
+++ b/index.rst
@@ -7,6 +7,7 @@ Welcome to PyTorch Tutorials
 * `Supporting Custom C++ Classes in torch.compile/torch.export `__
 * `Accelerating torch.save and torch.load with GPUDirect Storage `__
 * `Getting Started with Fully Sharded Data Parallel (FSDP2) `__
+* `Interactive Distributed Applications with Monarch `__
 
 .. raw:: html

From 351323c3ad269dda85ee6a17104d1afc2ea84c1c Mon Sep 17 00:00:00 2001
From: sekyondaMeta <127536312+sekyondaMeta@users.noreply.github.com>
Date: Thu, 13 Nov 2025 12:45:43 -0500
Subject: [PATCH 3/3] Update distributed.rst

---
 distributed.rst | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/distributed.rst b/distributed.rst
index 5ae3079555..8fe636d725 100644
--- a/distributed.rst
+++ b/distributed.rst
@@ -16,6 +16,7 @@ PyTorch with each method having their advantages in certain use cases:
 * `Tensor Parallel (TP) <#learn-tp>`__
 * `Device Mesh <#device-mesh>`__
 * `Remote Procedure Call (RPC) distributed training <#learn-rpc>`__
+* `Monarch Framework <#learn-monarch>`__
 * `Custom Extensions <#custom-extensions>`__
 
 Read more about these options in `Distributed Overview `__.
@@ -159,6 +160,22 @@ Learn RPC
       +++
       :octicon:`code;1em` Code
 
+.. _learn-monarch:
+
+Learn Monarch
+-------------
+
+.. grid:: 3
+
+   .. grid-item-card:: :octicon:`file-code;1em`
+      Interactive Distributed Applications with Monarch
+      :link: https://docs.pytorch.org/tutorials/intermediate/monarch_distributed_tutorial.html
+      :link-type: url
+
+      Learn how to use Monarch's actor framework
+      +++
+      :octicon:`code;1em` Code
+
 .. _custom-extensions:
 
 Custom Extensions