Short update about the multi-backend refactor #1596
Titus-von-Koeller announced in Announcements
Recent work establishing a torch.library custom_op registration that lets us dispatch to different hardware backends via the standard, official PyTorch dispatch mechanism has been merged to main and will soon be released. There are still a few small open questions around torch.compile and custom_ops for the optimizer functionality, but what's there is essentially done and will be the standard approach for registering backend functions going forward. See the PR "PyTorch Custom Operator Integration #1544" and the related [RFC] #1545 for details.
Therefore, all existing code from the multi-backend-refactor branch needs to be migrated to this new custom_ops approach via individual PRs to main. In other words, the multi-backend-refactor branch will become obsolete as its individual code paths, many of which have already been extensively tested, are ported over. For each individual PR, the goal is to make our recently cleaned-up test suite pass, but using that backend.