Short, educational deep dives into the building blocks of real-time, AI-ready data systems.
One of the best mentors I ever had told me a long time ago: in high tech, if you’re standing still, you’re already falling behind.
This blog isn’t really about dinosaurs, Douglas Adams references, mammals, or comets—it’s about change and how standing still means you get left behind.
Sixty-six million years ago, dinosaurs ruled the Earth. They had been the dominant land animals for well over a hundred million years. Then a massive, iridium-rich asteroid smashed into the planet.
The impact winter that followed darkened and chilled the Earth, and the great cold-blooded species couldn't survive. It would not have mattered if they had brains the size of a planet—their era was over.
From the ashes, mammals began to rise. It wasn't just that mammals were warm-blooded—it was that they were data-processing machines. They evolved from tiny mouse-like creatures into the OI, the "Original Intelligence" that dominates Earth today.
What made mammals different was their ability to ingest huge amounts of information, filter out what wasn’t immediately needed, analyze it in real time, adapt to changing conditions, and retain knowledge for future use. No animal before them had that combination.
This capacity to combine speed, adaptability, and memory is what allowed mammals to rise and dominate where dinosaurs could not.
So, What Is Perpetual Trading?
Perpetual trading (with the unfortunate abbreviation "perps") is the financial world's version of a game that never ends. Unlike traditional futures contracts, there's no expiration date — no "see you next quarter" moment. Traders can open and hold positions indefinitely, adjusting them in real time as markets move.
Instead of rolling over contracts or watching them expire into awkward settlements, perpetuals stay... well, perpetual. Prices stay anchored to the spot market through a clever mechanism called the funding rate — small periodic payments between long and short traders that keep the system balanced.
The result? A 24/7, self-stabilizing, high-velocity marketplace where liquidity never sleeps. It’s Wall Street meets esports: fast, continuous, and ruthlessly efficient.
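The funding-rate mechanism described above can be sketched in a few lines. This is an illustration only: the function name and the numbers are assumptions, not any exchange's exact formula.

```python
# Illustrative funding-payment calculation. The function name and the
# numbers below are assumptions, not any exchange's exact formula.

def funding_payment(position_notional: float, funding_rate: float) -> float:
    """Payment for one funding interval.

    Positive: longs pay shorts (perp trading above spot).
    Negative: shorts pay longs (perp trading below spot).
    """
    return position_notional * funding_rate

# A $1,000,000 long position at a funding rate of 0.045% per interval
payment = funding_payment(1_000_000, 0.00045)
print(round(payment, 2))  # 450.0
```

These small, periodic transfers are the "anchor" that keeps the perpetual price tethered to spot: when the perp trades rich, longs bleed to shorts until the gap closes, and vice versa.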
In technology, Big Data has played the role of the dinosaur. It gave us incredible advantages. It powered breakthroughs in e-commerce personalization, fraud detection, medical research, and supply chain optimization.
But in perpetual trading, Big Data shows its limits. It can analyze massive amounts of historical data, but it doesn’t adapt in real time.
In perpetual swaps—where markets shift in milliseconds—Big Data reacts too slowly. Like the dinosaurs, it’s powerful, but not adaptive enough for the new environment.
So what’s the “new species” everyone points to? Generative AI.
People imagine it as an Einstein-level intelligence: ask it anything, and it will produce the right answer. But in reality, it’s closer to a drooling toddler without the right inputs.
The single most important aspect of intelligence is wisdom: the ability to recognize when you don't know the answer. Generative AI won't say "I don't know." Instead, it makes something up. That's what we call a hallucination.
In trading, hallucinations aren’t harmless—they’re expensive. An AI that “guesses” a price level or misreads sentiment can cost millions in seconds.
Which is why context is everything.
If Big Data is the dinosaur, the Context Lake is the mammal. Like humans, dolphins, and bats, a Context Lake can ingest huge streams of information, filter out what isn't immediately needed, analyze what's left in real time, adapt to changing conditions, and retain what matters for later.
This is what generative AI needs in trading. Without the right context, it hallucinates. With a Context Lake feeding it real-time signals, it can react with precision and execute strategies before the window closes.
Dinosaurs dominated the Earth for millions of years, but they couldn’t adapt when the world changed. Mammals could.
Big Data has dominated finance for decades, but in the world of perpetual trading, the environment has shifted. Markets move in milliseconds. Liquidations cascade in seconds. Arbitrage disappears in the blink of an eye.
What’s needed now is the mammalian equivalent in tech: a system that can ingest, adapt, and provide the right context—fast enough to survive and thrive in perpetuals.
If you’re standing still, you’re falling behind.
At this point, if you’re a technology executive and you don’t need to understand the technical details, you can stop reading. Just remember: DON’T BE A DINOSAUR. If you stand still, your competition will eat you alive.
If you want more technical context, please keep reading.

Here’s what it looks like in practice. A new order book event comes in:
{
  "exchange": "Binance",
  "symbol": "BTC-PERP",
  "bid": 63985.50,
  "ask": 63987.20,
  "funding_rate": 0.00045,
  "timestamp": "2025-09-25T10:42:17Z"
}
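For illustration, here is a minimal Python sketch (the variable names are hypothetical) that parses that event and derives two basic signals from it: the mid price and the spread in basis points.

```python
import json

# Hypothetical sketch: derive basic signals from the order-book event above.
event = json.loads("""
{
  "exchange": "Binance",
  "symbol": "BTC-PERP",
  "bid": 63985.50,
  "ask": 63987.20,
  "funding_rate": 0.00045,
  "timestamp": "2025-09-25T10:42:17Z"
}
""")

mid = (event["bid"] + event["ask"]) / 2                  # mid price
spread_bps = (event["ask"] - event["bid"]) / mid * 1e4   # spread in basis points

print(round(mid, 2))         # 63986.35
print(round(spread_bps, 3))  # 0.266
```

Deriving these signals is trivial; the hard part is doing it, plus the joins and risk checks that follow, before the next tick arrives.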
In GCP, this data would travel:
Pub/Sub → Dataflow → BigQuery → maybe Pinecone → trading engine.
By then, the arb window is gone.
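To see why, consider a back-of-envelope latency budget for that multi-hop path. Every per-hop number below is an illustrative assumption, not a benchmark of any specific deployment.

```python
# Back-of-envelope latency budget for the multi-hop pipeline.
# Every per-hop number below is an illustrative assumption, not a benchmark.
hops_ms = {
    "Pub/Sub publish + delivery": 50,
    "Dataflow windowing + processing": 500,
    "BigQuery streaming insert + query": 1000,
    "vector lookup (e.g. Pinecone)": 50,
    "trading engine decision": 5,
}

total_ms = sum(hops_ms.values())
print(f"end-to-end: ~{total_ms} ms")  # end-to-end: ~1605 ms

# An arb window in perps often lasts tens of milliseconds,
# so even these optimistic hop estimates blow the budget many times over.
```

Even if each hop were several times faster than assumed here, the sum still lands far outside a window measured in tens of milliseconds.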
In Tacnode, you can run one query:
SELECT o.exchange,
       o.symbol,
       o.bid,
       o.ask,
       o.funding_rate,
       SUM(p.size * p.price) FILTER (
         WHERE p.wallet_id = '0xABC123'
           AND p.timestamp > now() - interval '10 seconds'
       ) AS recent_exposure,
       1 - (o.embedding <=> $liquidation_patterns) AS risk_score
FROM orderbook o
JOIN positions p ON o.symbol = p.symbol
WHERE o.symbol = 'BTC-PERP'
  AND p.wallet_id = '0xABC123'
GROUP BY o.exchange, o.symbol, o.bid, o.ask,
         o.funding_rate, o.embedding;
That one query pulls the live order book, aggregates the wallet's exposure over the last ten seconds, and scores liquidation risk against stored embedding patterns — all in a single pass.
For example, you could score relevance like this:
ml_input = {
    "bid": 63985.50,
    "ask": 63987.20,
    "funding_rate": 0.00045,
    "recent_exposure": 1.25e6,
    "risk_score": 0.92
}
relevance = ml_model.predict(ml_input)  # ml_model: a pre-trained relevance model
The model might output:
{
  "relevance_score": 0.87,
  "action": "hedge_exposure"
}
So now, instead of acting on every order book tick, the trading agent can score each tick's relevance and act only when the signal clears a threshold.
This makes the context lake not just a place for real-time joins, but also the launch pad for in-the-loop ML models that filter signal from noise as the data arrives.
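A minimal sketch of that in-the-loop gate, assuming the model output shown above; the 0.8 threshold and the function name are hypothetical choices, not recommended values.

```python
from typing import Optional

# Hypothetical relevance gate: act only on high-signal ticks.
# The 0.8 threshold is an assumption, not a recommended value.
RELEVANCE_THRESHOLD = 0.8

def handle_tick(model_output: dict) -> Optional[str]:
    """Return an action for high-signal ticks, None for noise."""
    if model_output["relevance_score"] >= RELEVANCE_THRESHOLD:
        return model_output["action"]  # e.g. "hedge_exposure"
    return None  # low-signal tick: ignore

print(handle_tick({"relevance_score": 0.87, "action": "hedge_exposure"}))  # hedge_exposure
print(handle_tick({"relevance_score": 0.42, "action": "rebalance"}))       # None
```

The point of the gate is economy of action: most ticks are noise, and the agent spends its latency budget only on the few that clear the bar.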

In perps, every millisecond counts. By the time your Pub/Sub job buffers, your arb edge is gone. By the time BigQuery aggregates, the liquidation cascade has already rippled through the book.
A Context Lake flips that equation: one query, live joins, embeddings in memory, and ML relevance checks in-the-loop. Instead of reacting after the fact, your trading agent filters noise, acts only on high-signal ticks, and executes before the window closes.
Think of it less as “big data analysis” and more as “big reflexes.” You’re not warehousing history—you’re building a nervous system for your trading stack.
Dinosaurs stored data. Mammals processed it, adapted and survived.
Generative AI will either hallucinate like a toddler or execute like Einstein—depending on whether you feed it noise or context.
If you’re serious about perps, code like a mammal.