Tags: pruthvistony/pytorch

v1.6.0-rc3

Update pthreadpool to pthreadpool:029c88620802e1361ccf41d1970bd5b07fd6b7bb. (pytorch#40524) (pytorch#41190)

Summary: Pull Request resolved: pytorch#40524

Reviewed By: ezyang

Differential Revision: D22215742

Pulled By: AshkanAliabadi

fbshipit-source-id: ef594e0901337a92b21ddd44e554da66c723eb7c

v1.6.0-rc2

Release GIL during DDP construction. (pytorch#40877)

Summary:
Pull Request resolved: pytorch#40495

As part of debugging flaky ddp_under_dist_autograd tests, I realized we were running into the following deadlock:

1) Rank 0 would go into DDP construction, hold GIL and wait for broadcast in DDP construction.
2) Rank 3 is a little slower and performs an RRef fetch call before the DDP construction.
3) The RRef fetch call is done on Rank 0 and tries to acquire GIL.
4) We now have a deadlock since Rank 0 is waiting for Rank 3 to enter the collective and Rank 3 is waiting for Rank 0 to release GIL (see the sketch after this list).
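
The fix, per the title, releases the GIL while DDP construction blocks in the collective, so an incoming RRef fetch can still acquire it. Below is a minimal sketch of that pattern using pybind11's `gil_scoped_release`; the names `blocking_broadcast` and `ddp_construct` are placeholders for illustration, not the actual PyTorch bindings.

```cpp
// Hedged sketch, not the actual PyTorch patch: drop the GIL around a blocking
// collective so other Python-bound calls (e.g. an incoming RRef fetch) can
// still acquire it while this rank waits for its peers.
#include <pybind11/pybind11.h>

namespace py = pybind11;

// Placeholder for the broadcast performed during DDP construction; in real
// DDP this blocks until every rank enters the collective.
void blocking_broadcast() {}

void ddp_construct() {
  {
    // The GIL is released here and reacquired automatically when `no_gil`
    // goes out of scope.
    py::gil_scoped_release no_gil;
    blocking_broadcast();
  }
  // The GIL is held again for any subsequent Python object manipulation.
}

PYBIND11_MODULE(ddp_sketch, m) {
  m.def("ddp_construct", &ddp_construct);
}
```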
ghstack-source-id: 106534442

Test Plan:
1) Ran ddp_under_dist_autograd 500 times.
2) waitforbuildbot

Differential Revision: D22205180

fbshipit-source-id: 6afd55342e801b9edb9591ff25158a244a8ea66a

Co-authored-by: Pritam Damania <pritam.damania@fb.com>

v1.6.0-rc1

.circleci: Fix upload to backup directory

Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

v1.5.1

[ONNX] Fix pow op export [1.5.1] (pytorch#39791)

* [ONNX] Fix pow op export (pytorch#38065)

Summary:
Fix pow type cast for opset 9 and update opset 12
Pull Request resolved: pytorch#38065

Differential Revision: D21485353

Pulled By: malfet

fbshipit-source-id: 3993e835ffad07b2e6585eb5cf1cb7c8474de2ec

* Update ort-nightly version as suggested in pytorch#39685 (comment)

* Apply changes from pytorch#37846 to `test_topk_smallest_unsorted`

Co-authored-by: neginraoof <neginmr@utexas.edu>

v1.5.1-rc1

[ONNX] Fix pow op export [1.5.1] (pytorch#39791)

* [ONNX] Fix pow op export (pytorch#38065)

Summary:
Fix pow type cast for opset 9 and update opset 12
Pull Request resolved: pytorch#38065

Differential Revision: D21485353

Pulled By: malfet

fbshipit-source-id: 3993e835ffad07b2e6585eb5cf1cb7c8474de2ec

* Update ort-nightly version as suggested in pytorch#39685 (comment)

* Apply changes from pytorch#37846 to `test_topk_smallest_unsorted`

Co-authored-by: neginraoof <neginmr@utexas.edu>

v1.5.0

[v.1.5.0] Ensure linearIndex of advanced indexing backwards is contiguous (pytorch#36962)

* [v.1.5.0] Ensure linearIndex of advanced indexing backwards is contiguous.

This is a more straightforward solution to the problem than pytorch#36957; I don't know about the relative performance.

Fixes: pytorch#36956

ghstack-source-id: 43c48ea
Pull Request resolved: pytorch#36959

* Fix test.
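
The change, per its title, forces the `linearIndex` tensor used in the advanced-indexing backward to be contiguous before it is consumed. A generic, hypothetical illustration of that pattern in ATen-style C++ follows; `gather_by_linear_index` is an invented helper, not the kernel touched by this patch.

```cpp
// Hypothetical helper, not the actual PyTorch change: make the index tensor
// contiguous before code that assumes densely packed elements runs over it.
#include <ATen/ATen.h>

at::Tensor gather_by_linear_index(const at::Tensor& src,
                                  const at::Tensor& linearIndex) {
  // contiguous() returns the same tensor when the layout is already dense and
  // row-major, and materializes a packed copy otherwise.
  at::Tensor index = linearIndex.contiguous().reshape({-1});
  return src.reshape({-1}).index_select(0, index);
}
```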

v1.5.0-rc5

[v.1.5.0] Ensure linearIndex of advanced indexing backwards is contiguous (pytorch#36962)

* [v.1.5.0] Ensure linearIndex of advanced indexing backwards is contiguous.

This is a more straightforward solution to the problem than pytorch#36957; I don't know about the relative performance.

Fixes: pytorch#36956

ghstack-source-id: 43c48ea
Pull Request resolved: pytorch#36959

* Fix test.

v1.5.0-rc4

make simple executor the default for OSS

v1.5.0-rc3

Use counter instead of vector of futures in `_parallel_run` (pytorch#36159) (pytorch#36334)

Summary:
This should be faster than allocating one mutex, flag, and condition variable per task.

Using `std::atomic<size_t>` to count the remaining tasks is not sufficient on its own, because the modification of the remaining counter and the signalling of the condition variable must happen atomically; otherwise `wait()` might get invoked after `notify_one()` was already called.
Pull Request resolved: pytorch#36159
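
A minimal sketch of the counter-plus-condition-variable pattern described above (not the actual ATen `_parallel_run` code): the counter decrement happens under the same mutex that `wait()` uses when checking its predicate, so a completion signal cannot be lost between the check and the wait.

```cpp
// Hedged sketch: wait for N tasks with one shared counter, mutex and
// condition variable instead of one future (or one mutex/flag/cv) per task.
#include <condition_variable>
#include <cstddef>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

void parallel_run_sketch(const std::vector<std::function<void()>>& tasks) {
  std::mutex mtx;
  std::condition_variable cv;
  std::size_t remaining = tasks.size();

  std::vector<std::thread> workers;
  workers.reserve(tasks.size());
  for (const auto& task : tasks) {
    workers.emplace_back([&, task] {
      task();
      {
        // Decrement under the mutex so the waiter either sees the new value
        // or is already waiting when notify_one() fires.
        std::lock_guard<std::mutex> lock(mtx);
        --remaining;
      }
      cv.notify_one();
    });
  }

  // The predicate is re-checked under the mutex on every wakeup, which is
  // what a bare std::atomic<size_t> counter cannot provide on its own.
  std::unique_lock<std::mutex> lock(mtx);
  cv.wait(lock, [&] { return remaining == 0; });
  lock.unlock();

  for (auto& w : workers) {
    w.join();
  }
}
```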

Test Plan: CI

Differential Revision: D20905411

Pulled By: malfet

fbshipit-source-id: facaf599693649c3f43edafc49f369e90d2f60de
(cherry picked from commit 986a8fd)
Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

Co-authored-by: Nikita Shulga <nshulga@fb.com>

v1.5.0-rc2

Revert "Fix handling of non-finite values in topk (pytorch#35253)" (p…

…ytorch#35582)

This reverts commit b12579d.

This patch in and of itself looks fine, but it's causing some AMP tests to fail.