Now we will learn what other operations we can add to a TensorFlow graph.
Declaring operations
Getting ready
Besides the standard arithmetic operations, TensorFlow provides additional operations that we should be aware of, and we should know how to use them before proceeding. Again, we can create a graph session by running the following code:
import tensorflow as tf
sess = tf.Session()
How to do it...
TensorFlow has the standard operations on tensors, that is, add(), sub(), mul(), and div(). Note that all of the operations in this section will evaluate the inputs element-wise, unless specified otherwise:
- TensorFlow provides some variations of div() and the relevant functions.
- It is worth mentioning that div() returns a value of the same type as its inputs. This means that it returns the floor of the division (akin to Python 2) if the inputs are integers. To get the Python 3 behavior, which casts integers to floats before dividing and always returns a float, TensorFlow provides the truediv() function, as follows:
print(sess.run(tf.div(3, 4)))
0
print(sess.run(tf.truediv(3, 4)))
0.75
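The Python 2 versus Python 3 division behavior that div() and truediv() mirror can be spot-checked in plain Python, with no TensorFlow needed (an illustrative analogue, not the TensorFlow implementation):

```python
# Plain-Python analogue of the div()/truediv() distinction.
# In Python 3, // floors integer division (like tf.div on integer inputs)
# and / always returns a float (like tf.truediv).
floor_result = 3 // 4   # floor division
true_result = 3 / 4     # true division
print(floor_result)  # 0
print(true_result)   # 0.75
```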
- If we have floats and want integer division, we can use the floordiv() function. Note that this will still return a float, but it will be rounded down to the nearest integer. This function is as follows:
print(sess.run(tf.floordiv(3.0, 4.0)))
0.0
- Another important function is mod(). This function returns the remainder after division. It is as follows:
print(sess.run(tf.mod(22.0, 5.0)))
2.0
- The cross product between two tensors is achieved by the cross() function. Remember that the cross product is only defined for two three-dimensional vectors, so it only accepts two three-dimensional tensors. The following code illustrates this use:
print(sess.run(tf.cross([1., 0., 0.], [0., 1., 0.])))
[ 0.  0.  1.]
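To see where that output comes from, the three-dimensional cross product formula can be written out by hand in plain Python (a sketch of the underlying math, not the TensorFlow implementation):

```python
def cross3(a, b):
    """Cross product of two 3-D vectors, a x b, written out component-wise."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

# The x unit vector crossed with the y unit vector gives the z unit vector.
print(cross3([1., 0., 0.], [0., 1., 0.]))  # [0.0, 0.0, 1.0]
```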
- Here is a compact list of the more common math functions. All of these functions operate element-wise:
| Function | Description |
| --- | --- |
| abs() | Absolute value of one input tensor |
| ceil() | Ceiling function of one input tensor |
| cos() | Cosine function of one input tensor |
| exp() | Base e exponential of one input tensor |
| floor() | Floor function of one input tensor |
| inv() | Multiplicative inverse (1/x) of one input tensor |
| log() | Natural logarithm of one input tensor |
| maximum() | Element-wise max of two tensors |
| minimum() | Element-wise min of two tensors |
| neg() | Negative of one input tensor |
| pow() | The first tensor raised to the second tensor element-wise |
| round() | Rounds one input tensor |
| rsqrt() | One over the square root of one tensor |
| sign() | Returns -1, 0, or 1, depending on the sign of the tensor |
| sin() | Sine function of one input tensor |
| sqrt() | Square root of one input tensor |
| square() | Square of one input tensor |
- Specialty mathematical functions: There are some special math functions that get used in machine learning that are worth mentioning, and TensorFlow has built-in functions for them. Again, these functions operate element-wise, unless specified otherwise:
| Function | Description |
| --- | --- |
| digamma() | Psi function, the derivative of the lgamma() function |
| erf() | Gaussian error function, element-wise, of one tensor |
| erfc() | Complementary error function of one tensor |
| igamma() | Lower regularized incomplete gamma function |
| igammac() | Upper regularized incomplete gamma function |
| lbeta() | Natural logarithm of the absolute value of the beta function |
| lgamma() | Natural logarithm of the absolute value of the gamma function |
| squared_difference() | Computes the square of the differences between two tensors |
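Several of these special functions also exist, per scalar, in Python's standard math module, which is handy for spot-checking the element-wise results TensorFlow produces (shown here with math, not TensorFlow):

```python
import math

# Per-element reference values for erf(), erfc(), and lgamma().
print(math.erf(0.0))     # 0.0  (the error function is odd, so erf(0) = 0)
print(math.erfc(0.0))    # 1.0  (erfc(x) = 1 - erf(x))
print(math.lgamma(1.0))  # 0.0  (gamma(1) = 1, so its log is 0)
```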
How it works...
It is important to know which functions are available to us so that we can add them to our computational graphs. We will mainly be concerned with the preceding functions. We can also generate many different custom functions as compositions of the preceding functions, as follows:
# Tangent function (tan(pi/4)=1)
print(sess.run(tf.tan(3.1416/4.)))
1.0
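The same composition, sine divided by cosine, can be checked against Python's standard math module to confirm that tan(pi/4) is indeed 1 (a spot check in plain Python, not TensorFlow):

```python
import math

# tan(x) = sin(x) / cos(x); at x = pi/4 both are sqrt(2)/2, so the ratio is 1.
x = math.pi / 4.0
print(math.sin(x) / math.cos(x))  # approximately 1.0
```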
There's more...
If we wish to add other operations to our graphs that are not listed here, we must create our own from the preceding functions. Here is an example of an operation that wasn't used previously that we can add to our graph. We chose to add a custom polynomial function, 3x^2 - x + 10, using the following code:
def custom_polynomial(value):
    return tf.sub(3 * tf.square(value), value) + 10

print(sess.run(custom_polynomial(11)))
362
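The value 362 can be verified with a plain-Python version of the same polynomial (an arithmetic check only; the TensorFlow version above builds graph operations instead):

```python
def poly(x):
    # 3x^2 - x + 10, the same polynomial as custom_polynomial()
    return 3 * x ** 2 - x + 10

# 3 * 121 - 11 + 10 = 362
print(poly(11))  # 362
```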