563207ae6bd9a8565243b04dcd92af9c33bac524,deepchem/models/keras_model.py,KerasModel,fit_generator,#KerasModel#Any#Any#Any#Any#Any#Any#Any#,246
Before Change
outputs = [outputs[i] for i in self._loss_outputs]
batch_loss = loss(outputs, labels, weights)
if variables is None:
    vars = self.model.trainable_variables
else:
    vars = variables
grads = tape.gradient(batch_loss, vars)
self._tf_optimizer.apply_gradients(zip(grads, vars))
self._global_step.assign_add(1)
current_step = self._global_step.numpy()
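For context, the before-change fragment runs inside an eager GradientTape training step. Below is a minimal self-contained sketch of that pattern, assuming TensorFlow 2.x; the model, optimizer, loss function, and train_step wrapper are hypothetical stand-ins, not KerasModel's actual surrounding code.

import tensorflow as tf

# Hypothetical stand-ins for KerasModel internals (assumption, not deepchem code).
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.MeanSquaredError()

def train_step(inputs, labels, variables=None):
    # Record the forward pass so gradients can be computed afterwards.
    with tf.GradientTape() as tape:
        outputs = model(inputs, training=True)
        batch_loss = loss_fn(labels, outputs)
    # As in the before-change code: default to all trainable variables.
    if variables is None:
        train_vars = model.trainable_variables
    else:
        train_vars = variables
    grads = tape.gradient(batch_loss, train_vars)
    optimizer.apply_gradients(zip(grads, train_vars))
    return batch_loss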
After Change
if loss is None:
    loss = self._loss_fn
var_key = None
if variables is not None:
    var_key = tuple(v.experimental_ref() for v in variables)
    # The optimizer creates internal variables the first time apply_gradients()
    # is called for a new set of variables. If that happens inside a function
    # annotated with tf.function it throws an exception, so call it once here.
    zero_grads = [tf.zeros(v.shape) for v in variables]
    self._tf_optimizer.apply_gradients(zip(zero_grads, variables))
if var_key not in self._gradient_fn_for_vars:
    self._gradient_fn_for_vars[var_key] = self._create_gradient_fn(variables)
apply_gradient_for_batch = self._gradient_fn_for_vars[var_key]
time1 = time.time()
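The after-change code works around a TensorFlow 2.x restriction: a Keras optimizer lazily creates its slot variables on the first apply_gradients() call, and creating variables inside an already-traced tf.function raises an exception. The fix applies zero gradients once eagerly to force slot creation, then caches one compiled update function per variable set, keyed by hashable variable references. Below is a minimal sketch of that pattern, assuming TensorFlow 2.x; the model, optimizer, and function names here are hypothetical stand-ins for KerasModel's internals.

import tensorflow as tf

# Hypothetical stand-ins (assumption, not deepchem code).
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
optimizer = tf.keras.optimizers.Adam()
gradient_fn_for_vars = {}  # one compiled update function per variable set

def create_gradient_fn(variables):
    @tf.function
    def apply_gradient_for_batch(inputs, labels):
        with tf.GradientTape() as tape:
            outputs = model(inputs, training=True)
            batch_loss = tf.reduce_mean(tf.square(outputs - labels))
        if variables is None:
            train_vars = model.trainable_variables
        else:
            train_vars = variables
        grads = tape.gradient(batch_loss, train_vars)
        # Safe inside tf.function only because the slots for an explicit
        # variable list were already created eagerly in get_update_fn().
        optimizer.apply_gradients(zip(grads, train_vars))
        return batch_loss
    return apply_gradient_for_batch

def get_update_fn(variables=None):
    var_key = None
    if variables is not None:
        # tf.Variable is unhashable; experimental_ref() gives a hashable key.
        var_key = tuple(v.experimental_ref() for v in variables)
        # Warm up the optimizer eagerly: the first apply_gradients() call
        # creates internal slot variables, which would raise if it happened
        # inside the traced tf.function above.
        zero_grads = [tf.zeros(v.shape) for v in variables]
        optimizer.apply_gradients(zip(zero_grads, variables))
    if var_key not in gradient_fn_for_vars:
        gradient_fn_for_vars[var_key] = create_gradient_fn(variables)
    return gradient_fn_for_vars[var_key]

Applying zero gradients leaves the weights themselves unchanged (the computed update is zero), so it is a cheap way to trigger slot creation outside the traced function.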
In pattern: SUPERPATTERN
Frequency: 3
Non-data size: 4
Instances
Project Name: deepchem/deepchem
Commit Name: 563207ae6bd9a8565243b04dcd92af9c33bac524
Time: 2020-02-12
Author: peastman@stanford.edu
File Name: deepchem/models/keras_model.py
Class Name: KerasModel
Method Name: fit_generator
Project Name: drckf/paysage
Commit Name: 544de8b887d171dc283ea8e5d9d7aa551fde1c95
Time: 2017-04-14
Author: charleskennethfisher@gmail.com
File Name: paysage/fit.py
Class Name: StochasticGradientDescent
Method Name: train
Project Name: scikit-image/scikit-image
Commit Name: aa9d770505d914aea76888e5eb11b5ddad54ce77
Time: 2016-07-16
Author: juan.n@unimelb.edu.au
File Name: skimage/feature/corner.py
Class Name:
Method Name: hessian_matrix