Sequential Data Types
Learn about different types of sequential data and their characteristics.
Time Series
Text
Speech
Sequential data has an inherent order where the position of elements matters. Examples include time series (stock prices, weather), text (words in sentences), speech (audio signals), and biological sequences (DNA, proteins).
Temporal Dependencies
Understand how elements in sequences depend on previous elements.
Short-term Dependencies: Current element depends on recent past
Long-term Dependencies: Current element depends on distant past
Variable Dependencies: Dependency length varies across sequences
Context Sensitivity: Meaning changes based on surrounding elements
import numpy as np

def demonstrate_temporal_dependencies():
    """Show examples of temporal dependencies in sequences"""
    print("=== TEMPORAL DEPENDENCIES EXAMPLES ===")

    # Example 1: Short-term dependency (moving average)
    print("\nShort-term Dependency Example:")
    np.random.seed(42)
    raw_data = np.random.randn(100)
    window_size = 3

    # Moving average depends on the last 3 values
    moving_avg = []
    for i in range(window_size, len(raw_data)):
        avg = np.mean(raw_data[i - window_size:i])
        moving_avg.append(avg)
    print(f"  Current value depends on previous {window_size} values")
    print("  Example: MA at t=5 = mean of values at t=[2,3,4]")

    # Example 2: Long-term dependency (cumulative sum)
    print("\nLong-term Dependency Example:")
    cumsum = np.cumsum(raw_data)
    print("  Cumulative sum depends on ALL previous values")
    print("  Value at t=50 influenced by every value from t=0 to t=49")

    # Example 3: Text dependencies
    print("\nText Dependency Examples:")
    examples = {
        "Short-term": {
            "text": "The cat sat on the ___",
            "dependency": "Next word depends on 'the' (immediate context)"
        },
        "Long-term": {
            "text": "John went to the store. He bought milk. Later, ___ went home.",
            "dependency": "'He' refers to 'John' from much earlier"
        }
    }
    for dep_type, example in examples.items():
        print(f"  {dep_type}:")
        print(f"    Text: {example['text']}")
        print(f"    Dependency: {example['dependency']}")

    return raw_data, moving_avg, cumsum

# Run demonstration
data, ma, cs = demonstrate_temporal_dependencies()

print("\n=== WHY RNNS ARE NEEDED ===")
rnn_advantages = [
    "Memory: Can remember previous inputs",
    "Recurrence: Process sequences of variable length",
    "Patterns: Learn temporal patterns and dependencies",
    "Context: Use context to make better predictions",
    "Efficiency: Share parameters across time steps"
]
for advantage in rnn_advantages:
    print(f"  {advantage}")
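The parameter sharing and variable-length handling listed above can be sketched as a minimal NumPy recurrence. This is an illustrative implementation, not a library API; the weight names `W_xh`, `W_hh`, and `b_h` are our own. The key point is that one fixed set of weights is applied at every time step, so the same function handles sequences of any length:

```python
import numpy as np

def rnn_forward(x_seq, h0, W_xh, W_hh, b_h):
    """Run a simple tanh RNN over a sequence of any length.

    The same weights (W_xh, W_hh, b_h) are reused at every time step,
    which is why one parameter set handles variable-length input.
    """
    h = h0
    hidden_states = []
    for x_t in x_seq:  # one step per sequence element, any length
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
        hidden_states.append(h)
    return np.stack(hidden_states)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8
W_xh = rng.normal(scale=0.1, size=(input_dim, hidden_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)
h0 = np.zeros(hidden_dim)

# Two sequences of different lengths, processed by the same weights
short_seq = rng.normal(size=(3, input_dim))
long_seq = rng.normal(size=(10, input_dim))
print(rnn_forward(short_seq, h0, W_xh, W_hh, b_h).shape)  # (3, 8)
print(rnn_forward(long_seq, h0, W_xh, W_hh, b_h).shape)   # (10, 8)
```

Because the hidden state `h` is carried from one step to the next, information from early inputs can influence later outputs, which is exactly the memory property listed above.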
Variable-Length Sequences
Handle sequences of different lengths, a key challenge in sequential modeling.
Real-world sequences vary in length: sentences have different word counts, time series have different durations, and audio clips have different lengths. RNNs naturally handle this variability.
# Variable-length sequence handling
def demonstrate_variable_length_sequences():
    """Show challenges and solutions for variable-length sequences"""
    print("=== VARIABLE-LENGTH SEQUENCE CHALLENGES ===")

    # Example sentences of different lengths
    sentences = [
        "Hello",                                  # Length: 1
        "How are you?",                           # Length: 3
        "This is a longer sentence",              # Length: 5
        "Machine learning with neural networks"   # Length: 5
    ]

    # Tokenize sentences
    tokenized = [sentence.split() for sentence in sentences]
    print("Example Sentences:")
    for i, (original, tokens) in enumerate(zip(sentences, tokenized)):
        print(f"  {i+1}. '{original}' -> {tokens} (length: {len(tokens)})")

    # Show the problem with traditional neural networks
    print("\nTraditional Neural Network Problem:")
    print("  - Fixed input size required")
    print("  - Need to pad or truncate sequences")
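The padding workaround mentioned above can be sketched as follows. This is a minimal illustration, assuming a string `"<PAD>"` as the padding token (real pipelines typically pad with a reserved integer ID after converting tokens to indices):

```python
def pad_sequences(token_lists, pad_value="<PAD>"):
    """Pad tokenized sentences to the length of the longest one."""
    max_len = max(len(tokens) for tokens in token_lists)
    return [tokens + [pad_value] * (max_len - len(tokens))
            for tokens in token_lists]

tokenized = [s.split() for s in
             ["Hello", "How are you?", "This is a longer sentence"]]
padded = pad_sequences(tokenized)
for row in padded:
    print(row)
# Every row now has length 5, e.g.
# ['Hello', '<PAD>', '<PAD>', '<PAD>', '<PAD>']
```

Padding makes a batch rectangular so it fits a fixed-size input, but it wastes computation on filler positions and usually requires masking so the model ignores them. RNNs avoid the issue by stepping through each sequence for exactly as many steps as it has elements.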