@dependabot dependabot bot commented on behalf of github Jul 24, 2024

Bumps torchrec from 0.2.0 to 0.8.0.

Release notes

Sourced from torchrec's releases.

v0.8.0

New Features

In Training Embedding Pruning (ITEP) for more efficient RecSys training

Introduces In-Training Embedding Pruning (ITEP), used internally at Meta to make RecSys training more efficient by decreasing the memory footprint of embedding tables. Pull request pytorch/torchrec#2074 adds the modules to TorchRec, with tests showing how to use them.
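The idea can be sketched in plain Python: drop rarely accessed rows from an embedding table and remap the surviving ids. This is a simplified illustration under assumed names and logic, not TorchRec's actual ITEP implementation (see pytorch/torchrec#2074 for the real modules).

```python
def prune_embedding_table(table, row_frequencies, keep_ratio=0.5):
    """Keep only the most frequently accessed rows of an embedding table.

    Returns the pruned table plus a remapping from original row ids to
    pruned row ids; pruned-away rows fall back to row 0 in this sketch.
    """
    n_keep = max(1, int(len(table) * keep_ratio))
    # Rank rows by access frequency, most frequent first.
    ranked = sorted(range(len(table)), key=lambda r: -row_frequencies[r])
    kept = sorted(ranked[:n_keep])
    remap = {old: new for new, old in enumerate(kept)}
    pruned = [table[r] for r in kept]
    return pruned, remap


def lookup(pruned, remap, row_id):
    # Pruned rows fall back to row 0 (a stand-in for a shared default row).
    return pruned[remap.get(row_id, 0)]
```

Pruning half the rows roughly halves the table's memory footprint, at the cost of collapsing cold ids onto a fallback row.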

Mean Pooling

Mean pooling is now enabled on embeddings for row-wise and table-row-wise sharding types in TorchRec. Mean pooling done through TBE (table-batched embedding) is not accurate for these sharding types, which modify the input due to sharding. This feature efficiently computes the divisor using caching and overlapping in input dist, which proved much more performant than out-of-library implementations. PR: pytorch/torchrec#1772
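A minimal sketch of why the explicit divisor is needed: under row-wise sharding each shard only sees some of a sample's ids, so a per-shard TBE cannot know the true bag length. The names below are illustrative assumptions, not TorchRec's API (see pytorch/torchrec#1772 for the real mechanism).

```python
def mean_pool(shard_partial_sums, bag_lengths):
    """Combine per-shard partial sums, then divide by the full bag length.

    shard_partial_sums: list of per-shard outputs, each a list of
        per-sample summed embedding vectors.
    bag_lengths: the true number of ids per sample across ALL shards --
        the divisor that TorchRec computes once and caches.
    """
    n_samples = len(bag_lengths)
    dim = len(shard_partial_sums[0][0])
    out = []
    for s in range(n_samples):
        total = [0.0] * dim
        for shard in shard_partial_sums:
            for d in range(dim):
                total[d] += shard[s][d]
        out.append([v / bag_lengths[s] for v in total])
    return out
```

Dividing each shard's partial sum by its local length instead would give a wrong answer whenever a sample's ids are split unevenly across shards.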

Changelog

Torch.export (non-strict) compatibility with KJT/JT/KT, EBC/Quantized EBC, sharded variants #1815 #1816 #1788 #1850 #1976 and dynamic shapes #2058

torch.compile support with TorchRec #2045 #2018 #1979

TorchRec serialization with non-strict torch.export for regenerating eager sparse modules (EBC) from IR for sharding #1860 #1848, with meta functionalization when using torch.export #1974

More benchmarking for TorchRec modules/data types #2094 #2033 #2001 #1855

More VBE support (data parallel sharding) #2093 (EmbeddingCollection) #2047 #1849

RegroupAsDict module for performance improvements with caching #2007

Train Pipeline improvements #1967 #1969 #1971

Bug Fixes and library improvements

v0.8.0-rc1

No release notes provided.

v0.7.0

No major features in this release

Changelog

  • Expanding out ZCH/MCH
  • Increased support with Torch Dynamo/Export
  • Distributed Benchmarking introduced under torchrec/distributed/benchmarks for inference and training
  • VBE optimizations
  • TWRW support for VBE
  • Generalized train_pipeline for different pipeline stage overlapping
  • Autograd support for traceable collectives
  • Output dtype support for embeddings
  • Dynamo tracing for sharded embedding modules
  • Bug fixes

v0.7.0-rc1

Pre release for v0.7.0

v0.6.0

VBE

TorchRec now natively supports VBE (variable batched embeddings) within the EmbeddingBagCollection module. This allows variable batch size per feature, unlocking sparse input data deduplication, which can greatly speed up embedding lookup and all-to-all time. To enable, simply initialize KeyedJaggedTensor with stride_per_key_per_rank and inverse_indices fields, which specify batch size per feature and inverse indices to reindex the embedding output respectively.
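The deduplication that VBE unlocks can be sketched in plain Python: look up each unique id once, then re-expand the results with inverse indices. This mirrors the role of KeyedJaggedTensor's inverse_indices field, but uses a plain dict and lists rather than TorchRec's actual API.

```python
def dedup_lookup(embedding_table, ids):
    """Look up embeddings for `ids`, computing each unique id only once."""
    unique_ids = []
    inverse_indices = []
    seen = {}
    for i in ids:
        if i not in seen:
            seen[i] = len(unique_ids)
            unique_ids.append(i)
        inverse_indices.append(seen[i])
    # One lookup per unique id -- the expensive step in real training,
    # where it also shrinks the all-to-all payload.
    unique_rows = [embedding_table[i] for i in unique_ids]
    # Reindex the output back to the original (duplicated) order.
    return [unique_rows[j] for j in inverse_indices]
```

With heavily repeated sparse ids, the lookup and communication cost scales with the number of unique ids rather than the raw input length.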

... (truncated)

Commits
  • 9264186 Enable prefetch stage for StagedTrainPipeline (#2239)
  • 09d1ff2 benchmark of fbgemm op - regroup_kts (#2159)
  • 4f114bc Improve Composability of ITEP (#2236)
  • 7a7790b TGIF check untraced ShardedQuantEmbeddingCollection and ShardedQuantEmbedding...
  • 4c98f7b Add util function to recursively get Trec modules (#2234)
  • a68a99f Add fused compute kernel to PT2 multiprocess test (#2235)
  • 50ecc5c copy _permute_tensor_by_segments to avoid packaging import error (#2170)
  • fbcc7af implementation of fbgemm op - kt_regroup_arguments (#2128)
  • 5be6133 Fix import statement (#2233)
  • 6a94b87 Enable gradient clipping with inf norm, and apply it to IG CTR on APS. (#2232)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

fcas and others added 2 commits May 22, 2024 21:10
Bumps [torchrec](https://github.com/pytorch/torchrec) from 0.2.0 to 0.8.0.
- [Release notes](https://github.com/pytorch/torchrec/releases)
- [Commits](meta-pytorch/torchrec@v0.2.0...v0.8.0)

---
updated-dependencies:
- dependency-name: torchrec
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jul 24, 2024