620ce4bd12b6379b0b2c184aa8f2f2e066cbe48a,pytext/optimizer/fp16_optimizer.py,DynamicLossScaler,check_overflow,#DynamicLossScaler#Any#,90

Before Change


    def check_overflow(self, master_params):
        self.is_overflow = False
        for p in generate_params(master_params):
            if p.grad is not None:
                cpu_sum = float(p.grad.float().sum())
                if (
                    cpu_sum == float("inf")
                    or cpu_sum == -float("inf")
                    or cpu_sum != cpu_sum
                ):
                    self.is_overflow = True
                    break

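The test above relies on IEEE-754 semantics: an overflowed gradient sum compares equal to positive or negative infinity, while NaN is the only value that is not equal to itself, so cpu_sum != cpu_sum detects NaN without any extra imports. A minimal, self-contained sketch of the same check (function and variable names here are illustrative, not taken from the commit):

import torch

def grad_has_overflow(grad):
    # Sum in float32 and pull the result back as a Python float;
    # any inf or NaN in the gradient propagates into the sum.
    cpu_sum = float(grad.float().sum())
    # NaN is the only float that is not equal to itself (IEEE-754).
    return cpu_sum == float("inf") or cpu_sum == -float("inf") or cpu_sum != cpu_sum

# Example: an fp16 gradient that overflowed to inf is flagged.
g = torch.full((4,), 65504.0, dtype=torch.float16) * 2  # 65504 is the fp16 max, so this overflows to inf
print(grad_has_overflow(g))  # True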
After Change


    def check_overflow(self, params):
        self.is_overflow = False
        for p in generate_params(params):
            self.check_overflow_(p.grad)
            if self.is_overflow:
                break

    def update_scale(self):
        r"""According to the overflow situation, adjust the loss scale.

        Once overflow happens, we decrease the scale by scale_factor.
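For context, update_scale in a dynamic loss scaler typically shrinks the scale after an overflow and grows it again after a window of clean steps. A hedged sketch of that scheme follows; the attribute names (scale, scale_factor, scale_window, _unskipped) and default values are assumptions, not taken from the pytext commit:

class DynamicLossScalerSketch:
    """Illustrative only: approximates the usual dynamic loss scaling scheme."""

    def __init__(self, init_scale=2.0 ** 15, scale_factor=2.0, scale_window=2000):
        self.scale = init_scale
        self.scale_factor = scale_factor
        self.scale_window = scale_window
        self.is_overflow = False
        self._unskipped = 0

    def check_overflow_(self, grad):
        # Per-tensor helper in the spirit of the refactor above: flag inf/NaN grads.
        if grad is not None:
            cpu_sum = float(grad.float().sum())
            if cpu_sum in (float("inf"), -float("inf")) or cpu_sum != cpu_sum:
                self.is_overflow = True

    def update_scale(self):
        if self.is_overflow:
            # Overflow: back off so the next iteration's gradients fit in fp16 range.
            self.scale /= self.scale_factor
            self._unskipped = 0
        else:
            self._unskipped += 1
            if self._unskipped >= self.scale_window:
                # A long run of clean steps: try a larger scale for better precision.
                self.scale *= self.scale_factor
                self._unskipped = 0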
In pattern: SUPERPATTERN

Frequency: 3

Non-data size: 4

Instances


Project Name: facebookresearch/pytext
Commit Name: 620ce4bd12b6379b0b2c184aa8f2f2e066cbe48a
Time: 2019-08-11
Author: yuqingl@fb.com
File Name: pytext/optimizer/fp16_optimizer.py
Class Name: DynamicLossScaler
Method Name: check_overflow


Project Name: cesium-ml/cesium
Commit Name: 384e8e0f8a91815568aa3d4c651c8c5e48979262
Time: 2015-12-17
Author: a.crellinquick@gmail.com
File Name: mltsp/util.py
Class Name:
Method Name: cast_model_params


Project Name: oddt/oddt
Commit Name: 55cd8a43035d0a2d1e93d711a92ad7994e78e7b7
Time: 2018-02-16
Author: mwojcikowski@users.noreply.github.com
File Name: oddt/toolkits/extras/rdkit.py
Class Name:
Method Name: PDBQTAtomLines