"fix","description" "Remove tf.function decorator","Remove the tf.function decorator because it was not providing the benefits it was supposed to." "Use tf.function","Add the tf.function decorator to gain its benefits. " "Re-add tf.function decorator","Remove the decorator but then re-add it to the code." "Add state as function parameter","A function works in python but not with tf.function, complaining undefined variable. Solution is to add a new parameter to function and specifically pass it." "Avoid incompatible code in graph mode","tf.eagerly_executing() is used to see if eager execution is enabled. Using tf.eagerly_executing() to check whether we are in graph mode, and if so, do a specific behavior for graph mode (e.g. only print shapes and dtype, not values)" "Embed converted function declaration in calling function","This way, the Python function being converted by tf.function() (transitively, in some cases) now has access to the callers variables, i.e., they are now in-scope statically where they may have been in scope dynamically before." "None","The issue is still open and there is currently no fix." "Change tf.function argument","The user had added an incorrect value to a tf.function parameter." "Add input_signature argument to tf.function","The user specified an input_signature of the tf.function parameter. A possibly nested sequence of tf.TensorSpec objects specifying the shapes and dtypes of the Tensors that will be supplied to this function. If None, a separate function is instantiated for each inferred input signature. If input_signature is specified, every input to func must be a Tensor, and func cannot accept **kwargs. (https://www.tensorflow.org/api_docs/python/tf/function)" "Use specialized tf.data.Iterator for iteration constructs.","Iterators advance only once during tracing in a tf.function block. The specialized tf.data.Iterator represents an iterator to enumerate elements and return the next element if available." "Added tf.Variable","In general, it is recommended to create stateful objects like tf.Variable outside of tf.function and passing them as arguments. (https://www.tensorflow.org/api_docs/python/tf/function)" "Add conditions for each execution mode","Make sure that you are using tf.function and its chosen parameters for the correct execution mode e.g. graph mode, compiled mode, eager mode" "Use type alias","There is a misinterpretation of type(...) function and add type alias to fix problem." "Change input argument to tf.Tensor instead of Python object","Input arguments of function decorated by tf.function as tf.Tensor instead of pure Python object to reduce retracing." "Change value of epsilon ","Changing the value of epsilon to avoid inconsistency since there was a bug in constant folding. " "Add error/warnings to code","Add error or warning messages to inform users to avoid certain behavior with tf.function. This is not really a ""fix"" but more like a way to avoid fixing the problem." "Use Tensorflow ops, instead of NumPy and Python calls","tf.function works best with TensorFlow ops; NumPy and Python calls are converted to constants." "Upgrade to new version of library","Install new version of library that fixes the issue/problem." 
"Use iter in a loop inside tf.function when iterating over Datasets","Use iter(dataset) because AutoGraph translates into a tf.while_loop instead of a Dataset.reduce which does not support gradients" "Relocate tf.function (use on a different function)","The correct function is not decorated with tf.function therefore they need to relocate the tf.function." "Use tf.numpy_function inside tf.function","You can't use numpy inside a tf.function. The function will be compiled once. If you have to use numpy inside a tf.function you need tf.numpy_function." "Pass the tensor to a function outside the scope of the @tf.function decorator","Tf.function has some issues with other functions (eg. .numpy()), so in order to use both functions you have to pass the tensor to a function outside the scope of the @tf.function decorator" "Hardcode arguments","In Eager mode, a Tensor input is as good as static as we know its value but in tf.function it is not supported. Therefore, we need to hardcode the arguments. " "Move or remove a function invocation inside of a tf.function outside","Incompatible function are moved outside tf.function to get be able to use both the function and the tf.function or remove the incompatible function for the code" "Use tf.map_fn for list comprehension","Autograph does not support list comprehension but you can use tf.map_fn as a workaround for that. " "Instead of having variables, convert to tensors ","TF complains about having variables therefore they convert to tensors so that they can use tf.function" "Use TF functions instead of Keras functions","In TF, graphs can be used only TF functions (tf.matmul(), tf.add(), and so on), but not Keras." "Use function that works with Autograph","Some functions have inconsistent behavior due to functions inside of tf.function because of Autograph, so we need to use other functions that provide that consistency" "Use TensorArray instead of list","A common pattern is to accumulate intermediate values from a loop. Normally, this is accomplished by appending to a Python list or adding entries to a Python dictionary. However, as these are Python side effects, they will not work as expected in a dynamically unrolled loop. Use tf.TensorArray to accumulate results from a dynamically unrolled loop." "Replace tf.constant with tf.convert_to_tensor","tf.convert_to_tensor can accept both Tensor type and python native types, not like tf.constant" "Use tf.random.experimental.Generator/tf.random.Generator ","The old RNGs from TF 1.x such as tf.random.uniform and tf.random.normal are not yet deprecated but strongly discouraged. tf.random.Generator obeys the same rules as tf.Variable when used with tf.function" "Use tf.data.Dataset.from_tensor_slices to use another type of dataset","Using tf.Data.from_tensor_slices creates a Dataset whose elements are slices of given tensors. (https://www.tensorflow.org/api_docs/python/tf/data/Dataset#from_tensor_slices)" "Remove loop from tf.function","With the loop in tf.function, there is a graph placement issue and and code breaks" "Use tf.shape()","Use tf.shape() instead of accessing a tensor's shape directly" "Use Custom Keras Model Subclass","Replace custom model (not Keras) with a a custom Keras Model Subclass " "Add conditions for supported and unsupported XLA","There is a unsupported op which is not supported in XLA, create a function where you skip the XLA compilation if not supported." 
"Create a new tf.function for each optimizer","If you need to change the optimizer during training, a workaround is to create a new Function for each optimizer, calling the ConcreteFunction directly. (https://www.tensorflow.org/guide/function#using_with_multiple_keras_optimizers)" "Use tf.gather_nd","This function gathers slices from params into a Tensor with shape specified by indices. This is used to convert numpy to tf.function compatible code. " "Provide datatype specification","User gives the data types before running in graph mode. A lot of information has to be pre-known for tf.data in order to build a true data pipeline (since tf.data is always in graph mode). That promotes the idea of ""parsing the metadata"" automatically in eager mode, then doing further processing in graph mode (to have an efficient pipeline). " "Use tf.print()","Usual logging is disabled in graph mode, so only way you have to display a message is to use the internal tf.print() function. tf.print calls will execute every time, and can help you track down intermediate values during execution. https://www.tensorflow.org/guide/function#debugging "