⏱️ 50 min

TensorFlow Basics

Get started with Google's deep learning framework

Introduction to TensorFlow

TensorFlow is an end-to-end platform for machine learning. It provides tools for building and deploying ML models at scale.

**Key Features:**

- Eager execution for intuitive development
- Keras API for easy model building
- Production deployment with TensorFlow Serving
- Multi-platform support (CPU, GPU, TPU)
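Eager execution, the first feature above, means operations run immediately and return concrete values instead of building a deferred graph, so you can inspect results with ordinary Python. A minimal sketch:

```python
import tensorflow as tf

# In TF 2.x, eager execution is on by default
print(tf.executing_eagerly())  # True

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])

# The result is a concrete tensor, inspectable right away
# (no graph building or session required)
c = tf.matmul(a, b)
print(c.numpy())  # [[1. 3.] [3. 7.]]
```

In TF 1.x, the same matrix multiply would only produce a value after constructing a graph and running it in a session; eager mode removes that indirection.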

Building a Neural Network in TensorFlow

Create models with Keras:

```python
import tensorflow as tf
from tensorflow import keras
import numpy as np

# Create synthetic data
np.random.seed(42)
X_train = np.random.randn(1000, 20)
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
X_test = np.random.randn(200, 20)
y_test = (X_test[:, 0] + X_test[:, 1] > 0).astype(int)

# Build model with Keras Sequential API
model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(20,)),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(32, activation='relu'),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(1, activation='sigmoid')
])

# Compile model
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss='binary_crossentropy',
    metrics=['accuracy']
)

# Print model architecture
print("Model Architecture:")
model.summary()

# Train model
history = model.fit(
    X_train, y_train,
    epochs=20,
    batch_size=32,
    validation_split=0.2,
    verbose=0
)

# Evaluate
test_loss, test_acc = model.evaluate(X_test, y_test, verbose=0)
print(f"\nTest Accuracy: {test_acc * 100:.2f}%")
print(f"Test Loss: {test_loss:.4f}")

# Make predictions
sample_predictions = model.predict(X_test[:5], verbose=0)
print("\nSample Predictions:")
for i, pred in enumerate(sample_predictions):
    print(f"  Sample {i+1}: {pred[0]:.4f} -> Class {int(pred[0] > 0.5)}")
```

Output:

```
Model Architecture:
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 64)                1344      
 dropout (Dropout)           (None, 64)                0         
 dense_1 (Dense)             (None, 32)                2080      
 dropout_1 (Dropout)         (None, 32)                0         
 dense_2 (Dense)             (None, 1)                 33        
=================================================================
Total params: 3,457
Trainable params: 3,457
Non-trainable params: 0

Test Accuracy: 99.50%
Test Loss: 0.0287

Sample Predictions:
  Sample 1: 0.9834 -> Class 1
  Sample 2: 0.0156 -> Class 0
  Sample 3: 0.9921 -> Class 1
  Sample 4: 0.0089 -> Class 0
  Sample 5: 0.9876 -> Class 1
```

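The feature list mentions production deployment; the usual first step there is persisting a trained model to disk. A minimal sketch of the save/load round trip (the file name `demo_model.keras` is just an example; the native `.keras` format requires a reasonably recent TensorFlow):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# A small model to demonstrate the round trip
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Save in the native Keras format (selected by the .keras extension)
model.save('demo_model.keras')

# Restore: architecture, weights, and compile state come back together
restored = keras.models.load_model('demo_model.keras')

# Both models produce the same predictions
x = np.random.randn(3, 4).astype('float32')
np.testing.assert_allclose(
    model.predict(x, verbose=0), restored.predict(x, verbose=0), rtol=1e-5
)
```

For TensorFlow Serving specifically, models are typically exported in the SavedModel directory format instead, but the Keras-level round trip above is the simplest starting point.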
Custom Training Loop

For finer-grained control over training, subclass `tf.keras.Model` and write your own training step with `tf.GradientTape`:

```python
import tensorflow as tf

# Create a simple model
class SimpleModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = tf.keras.layers.Dense(64, activation='relu')
        self.dense2 = tf.keras.layers.Dense(1, activation='sigmoid')
    
    def call(self, inputs):
        x = self.dense1(inputs)
        return self.dense2(x)

# Initialize
model = SimpleModel()
optimizer = tf.keras.optimizers.Adam(0.001)
loss_fn = tf.keras.losses.BinaryCrossentropy()

# Custom training step
@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        predictions = model(x, training=True)
        loss = loss_fn(y, predictions)
    
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    
    return loss

# Training loop (reuses X_train and y_train from the earlier example)
print("Custom Training Loop:")
batch_size = 32
num_batches = (len(X_train) + batch_size - 1) // batch_size  # ceiling division
for epoch in range(10):
    epoch_loss = 0.0
    for i in range(0, len(X_train), batch_size):
        batch_x = X_train[i:i + batch_size]
        batch_y = y_train[i:i + batch_size]
        loss = train_step(batch_x, batch_y)
        epoch_loss += float(loss)

    print(f"Epoch {epoch + 1}: Loss = {epoch_loss / num_batches:.4f}")
```

Output:

```
Custom Training Loop:
Epoch 1: Loss = 0.6234
Epoch 2: Loss = 0.3891
Epoch 3: Loss = 0.2567
Epoch 4: Loss = 0.1823
Epoch 5: Loss = 0.1345
Epoch 6: Loss = 0.1024
Epoch 7: Loss = 0.0812
Epoch 8: Loss = 0.0661
Epoch 9: Loss = 0.0553
Epoch 10: Loss = 0.0472
```
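The manual slicing in the loop above can be replaced with `tf.data`, TensorFlow's input-pipeline API, which handles shuffling and batching and feeds batches straight into a `@tf.function` training step. A minimal sketch on synthetic data shaped like the examples above:

```python
import numpy as np
import tensorflow as tf

# Synthetic data, same shape convention as the earlier examples
X = np.random.randn(100, 20).astype('float32')
y = (X[:, 0] + X[:, 1] > 0).astype('float32')

# Build a pipeline: slice into examples, shuffle, then batch
dataset = tf.data.Dataset.from_tensor_slices((X, y)).shuffle(100).batch(32)

# Iterating yields (features, labels) batches: 100 examples -> 4 batches
# (three of size 32 and a final partial batch of 4)
for batch_x, batch_y in dataset:
    print(batch_x.shape, batch_y.shape)
```

In the custom loop, `for batch_x, batch_y in dataset:` replaces the index arithmetic, and `drop_remainder=True` on `.batch()` keeps every batch the same shape, which avoids `@tf.function` retracing on the final partial batch.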