"category","description"
"Unknown","A problem whose category cannot be determined."
"Test","The code change is related to test code (test cases). This can be discovered via gitcproc. Similar to ""Other/Unknown"" catch-all category except that it occurs in test code."
"API Misuse","tf.function is not being used in the recommend way. An example is the use of Autograph. When AutoGraph is applied to normal Python code, you should expect no change in functionality. However, when applied to TensorFlow control flow (for example, an if statement with a tf.Tensor condition), there are certain limitations. This section describes these limitations and practices that will allow you to avoid them. (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/g3doc/reference/limitations.md)"
"Performance","You can use tf.function to make graphs out of your programs. It is a transformation tool that creates Python-independent dataflow graphs out of your Python code. This will help you create performant and portable models, and it is required to use SavedModel. Screen reader support enabled."
"Other","Other change (e.g., syntax, refactoring, clean up)."
"Exposed variable state ","There's some problem with the program state being saved when the tf.function conversion happens."
"Representation","Constrained and unconstrained representations."
"No support for eager mode","tf.function was added because there was no support for eager mode for certain functions."
"Compilation error","AutoGraph is ""compiling"" otherwise dynamic Python code. Thus, dynamic language features may not be available."
"Graph overly specialized on input shapes","experimental_relax_shapes parameter set to False when it should have been True or the input_signature parameter is present when it should be either removed or changed to None (None is the default)."
"Unspecified input signature","Input signature is missing."
"Underspecified input signature","Input signature is missing parameters."
"Using Python iterators and generators","The use of iterators and generators inside a tf.function causes unexpected behavior related to the Python side effects tf.function limitation."
"Creating tf.Variables","tf.function only supports creating variables once, when first called, and then reusing them. You cannot create tf.Variables in new traces. Creating new variables in subsequent calls, which is currently not allowed."
"Conversion to TFLite","Unable to convert to TFLite without wrapping tf.function. One of the options to convert a TF 2.x model is using tf.lite.TFLiteConverter from concrete functions. tf.function stores the tf.Graph corresponding to that signature in a ConcreteFunction. A ConcreteFunction is a wrapper around a tf.Graph. Note: Currently, it only supports the conversion of a single concrete function.(https://www.tensorflow.org/lite/convert)"
"Numerical errors","TF's internal implementations in some cases require to be specific label dtypes and if they use incorrect labels it might be unstable and cause numerical errors."
"Graph inadequately specialized on input shapes","experimental_relax_shapes parameter set to True when it should have been False."
"Incompatibility","Incompatibility between execution modes. tf.function is used in a context that is not amenable to graph conversion. For example, the loss functions being used are not compatible, or there is a limitation of autograph."
"Redundant decorator","A tf.function decorator is removed because all call paths have calling functions already decorated with tf.function. As such, the decorator of the called function as no effect."
"TensorFlow bug","Open Issue in TensorFlow, which is being workaround"
"Obtaining concrete functions","To obtain an individual graph, use the get_concrete_function method of the callable created by tf.function. It can be called with the same arguments as func and returns a special tf.Graph object."
"Debuggability","tf.function reduces the ability to easily debug code. ""In general, debugging code is easier in eager mode than inside tf.function. You should ensure that your code executes error-free in eager mode before decorating with tf.function. To assist in the debugging process, you can call tf.config.run_functions_eagerly(True) to globally disable and reenable tf.function."" (https://www.tensorflow.org/guide/function#debugging)"
"Segmentation fault","The program crashes due to a C module accessing illegal or out-of-bounds memory. Custom code using C code generates a segmentation fault using tf.function because of incompatible compiler version which then leads to code getting called improperly. "
"Executing Python side-effects","Side effects, like printing, appending to lists, and mutating globals, can behave unexpectedly inside a Function, sometimes executing twice or not all. They only happen the first time you call a Function with a set of inputs. Afterwards, the traced tf.Graph is re-executed, without executing the Python code.

The general rule of thumb is to avoid relying on Python side effects in your logic and only use them to debug your traces. Otherwise, TensorFlow APIs like tf.data, tf.print, tf.summary, tf.Variable.assign, and tf.TensorArray are the best way to ensure your code will be executed by the TensorFlow runtime with each call."
"Input shape","Issues related to tf.function parameters input_signature and experimental_relax_shapes which deal with the input shapes for tf.function. 

experimental_relax_shapes: When True, tf.function may generate fewer graphs that are less specialized on input shapes.

input_signature: A possibly nested sequence of tf.TensorSpec objects specifying the shapes and dtypes of the Tensors that will be supplied to this function. If None, a separate function is instantiated for each inferred input signature. If input_signature is specified, every input to func must be a Tensor, and func cannot accept **kwargs."
"Random number generation","The developer did not use the random number generation facilities consistently with the TF Documentation. Specifically, TensorFlow provides two approaches for controlling the random number generation process. The first is through the explicit use of tf.random.Generator objects. Each such object maintains a state (in tf.Variable) that will be changed after each number generation. The other is through the purely-functional stateless random functions like tf.random.stateless_uniform. Calling these functions with the same arguments (which include the seed) and on the same device will always produce the same results."
"Deadlock","Deadlock occurs when a process or thread enters a waiting state because a requested system resource is held by another waiting process. Example in TF: A problem is the use of tf.function inside of a py_function inside of a tf.function (introduced by tf.data around dataset_map). The current TensorFlow runtime will need two interop threadpool threads (one for the outer tf.function and another for the inner tf.function) to execute such function and if N (where N is the size of the interop threadpool thread) such functions are scheduled for execution concurrently (which can happen tf.data.Dataset.map is used with num_parallel_calls), then the entire execution can hang."