AttributeError: 'MMDistributedDataParallel' object has no attribute '_sync_params'

See original GitHub issue

I am trying to train a Detector using the latest mmcv version and torch 1.11. I installed mmcv simply by:

git clone https://github.com/open-mmlab/mmcv.git
cd mmcv
pip install -r requirements/optional.txt
MMCV_WITH_OPS=1 pip install -e .

After launching the training script, I get an error:

Traceback (most recent call last):
  File "/workspace/Project/launch_training.py", line 190, in <module>
    main()
  File "/workspace/Project/launch_training.py", line 179, in main
    train_detector(
  File "/opt/conda/lib/python3.8/site-packages/mmdet/apis/train.py", line 208, in train_detector
    runner.run(data_loaders, cfg.workflow)
  File "/opt/conda/lib/python3.8/site-packages/mmcv/runner/epoch_based_runner.py", line 127, in run
    epoch_runner(data_loaders[i], **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/mmcv/runner/epoch_based_runner.py", line 50, in train
    self.run_iter(data_batch, train_mode=True, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/mmcv/runner/epoch_based_runner.py", line 29, in run_iter
    outputs = self.model.train_step(data_batch, self.optimizer,
  File "/opt/conda/lib/python3.8/site-packages/mmcv/parallel/distributed.py", line 48, in train_step
    self._sync_params()
  File "/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1185, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'MMDistributedDataParallel' object has no attribute '_sync_params'

I did some digging into the issue and found this PR: https://github.com/pytorch/pytorch/pull/64514, which indicates that _sync_params has been removed from PyTorch. Is there any plan to fix this issue soon, or is there a workaround for it?
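Until this is fixed on the mmcv side, a version-gated sync seems like one possible stopgap. The sketch below is not the official mmcv fix; it only illustrates the idea, and it relies on private DDP internals (_sync_params on torch <= 1.10, _check_sync_bufs_pre_fwd/_sync_buffers on torch >= 1.11) that may change between releases:

# Rough sketch of a version-gated sync helper, assuming the private DDP
# internals named below; not the official mmcv fix.
import torch
from packaging import version

def _sync_for_forward(ddp_module):
    """Sync params/buffers before forward across PyTorch versions."""
    torch_version = version.parse(torch.__version__.split('+')[0])
    if torch_version >= version.parse('1.11.0'):
        # torch >= 1.11: _sync_params was removed; buffers are synced separately
        if ddp_module._check_sync_bufs_pre_fwd():
            ddp_module._sync_buffers()
    else:
        # torch <= 1.10: the old private helper is still available
        if getattr(ddp_module, 'require_forward_param_sync', False):
            ddp_module._sync_params()

Calling something like this in place of the bare self._sync_params() in mmcv/parallel/distributed.py should avoid the AttributeError on both old and new PyTorch, but the proper fix belongs upstream in mmcv.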

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

4 reactions
teamwong111 commented, Mar 18, 2022

> Any updates on this? I met the same problem, and currently it does not seem straightforward to install PyTorch 1.10 other than by building from source. Thanks!

We will fix it ASAP. I have submitted a PR.

1 reaction
teamwong111 commented, Mar 1, 2022

Which version of PyTorch do you use? The latest stable PyTorch (1.10.2) still uses this function, as far as I know. We will handle this problem when PyTorch 1.11.0 is released.
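For anyone unsure which case they are in, a quick check in the training environment shows the installed version and whether the private helper is still defined on DistributedDataParallel (assuming it is safe to probe a private attribute for diagnostics only):

# Quick diagnostic: which torch is installed, and does DDP still define _sync_params?
import torch
from torch.nn.parallel import DistributedDataParallel

print(torch.__version__)                                 # e.g. '1.10.2' vs '1.11.0'
print(hasattr(DistributedDataParallel, '_sync_params'))  # True on <= 1.10, False on 1.11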
