BTC#: Endianness

Series: BTC# – Learning to Program Bitcoin in C#

« Previous: Digital Signature Code

Next: SEC Serialisation »

This post is some background on how data is serialised byte-by-byte. If you already know your big end from your little end, you can skip it.

ComputingBasics.png

One of the things we need to contend with when we write down (or “serialise”) the big long numbers that Bitcoin uses is the problem of “endianness.”

Different microprocessors read and write their data in different ways, much like people do. Some human languages are written left-to-right, like English and Russian, others are written right-to-left, like Hebrew and Arabic. Likewise, some cultures write their dates day-month-year, others write year-month-day.

LanguagesAndDates.png

Britain and Europe tend to write their dates starting with the smallest unit, the day, and work their way up to the largest unit, the year. In the language of computer architecture, we would call this a “little-endian” date, because it starts at the little end. China and Korea use “big-endian” dates. (In the U.S. and the Philippines, people start at the middle end. That’s cool. You do you.)

Religious Eggs-tremism

The names “big-endian” and “little-endian” come from Gulliver’s Travels and refer to the subject of a religious war between two groups of Lilliputians, who can’t agree on which end boiled eggs should be eaten from. The sacred text says that “all true believers shall break their eggs at the convenient end.” Opinions differ violently on which end is most convenient. The story is a satire on people who don’t believe in “you do you.”

EggConvenientEnd.png

Motorola vs Intel

When the first microprocessors appeared on the market, opinions differed on the best way to store multi-byte numbers. The big two microprocessors released in 1974 were the Motorola 6800 and the Intel 8080, ancestor of the x86 processor family that powered the PC revolution. Both were 8-bit processors, meaning that they handled data in chunks of 8 binary digits.

Storing numbers bigger than 255, the biggest value that can be stored in 8 bits, required multiple bytes. When multi-byte numbers were written to and read from memory, the 6800 stored the most significant byte, the “big end,” first. This is similar to how we write multi-digit numbers: thousands first, then hundreds, then tens, then units.

Intel took the opposite approach, storing the least significant byte first. The Motorola way seems more natural to us, but Intel engineers believed that their way had advantages: operations like addition and multiplication work through a number starting at the least significant end, so it helps to have that byte come first.
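To make that concrete, here’s a small sketch (mine, not from the original post) showing the two layouts of the 32-bit value 0x12345678:

using System;
using System.Linq;

class EndianDemo
{
    static void Main()
    {
        uint value = 0x12345678;

        // BitConverter emits bytes in whatever order the machine it runs on uses.
        // On an Intel-style little-endian machine this prints 78-56-34-12.
        byte[] native = BitConverter.GetBytes(value);
        Console.WriteLine(BitConverter.ToString(native));
        Console.WriteLine(BitConverter.IsLittleEndian);   // True on x86/x64

        // The Motorola-style big-endian layout is the reverse: 12-34-56-78
        // (assuming, as above, that we started on a little-endian machine).
        byte[] bigEndian = native.Reverse().ToArray();
        Console.WriteLine(BitConverter.ToString(bigEndian));
    }
}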

In 1974, Motorola and Intel systems didn’t need to talk to each other so it didn’t matter. Today, now that everything talks to everything, it’s crucial to know which format data is stored and transmitted in.

MotorolaIntelEndian.png

Big Integers in C#

The first place we care about this is when we need to create C# BigIntegers. If we have an array of bytes that we want to turn into a BigInteger object, we need to know whether that byte array is big-endian or little-endian. We also need to know that the BigInteger constructor that takes a byte array expects the bytes in little-endian format.
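A couple of quick examples of that constructor’s behaviour (my illustration, not from the original post):

using System.Numerics;

// The byte array is read little-endian: low-order byte first.
var a = new BigInteger(new byte[] { 0x00, 0x01 });  // 256 (0x0100)

// The top bit of the last byte is treated as a sign bit,
// so a lone 0xFF byte becomes -1 rather than 255.
var b = new BigInteger(new byte[] { 0xFF });        // -1

// Appending a zero byte keeps the value positive.
var c = new BigInteger(new byte[] { 0xFF, 0x00 });  // 255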

To make this simpler, I’ve created an extension method on byte[] to create a BigInteger based on a given format.

public static BigInteger ToBigInteger(this byte[] buffer, ByteArrayFormat format = ByteArrayFormat.LittleEndianSigned)
{
    // BigInteger wants its input little-endian, so reverse big-endian buffers first.
    if ((format & ByteArrayFormat.BigEndian) == ByteArrayFormat.BigEndian)
    {
        buffer = buffer.Reverse().ToArray();
    }
    // For unsigned input, append a zero byte so the high bit can't be read as a sign bit.
    if ((format & ByteArrayFormat.Unsigned) == ByteArrayFormat.Unsigned)
    {
        buffer = buffer.PostfixZeroes(1);
    }
    return new BigInteger(buffer);
}

This reverses the array if we’re told that it’s in big-endian form and then, if we’ve asked for an unsigned interpretation, appends a zero byte so that the number can’t come out negative.
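The post doesn’t show ByteArrayFormat or PostfixZeroes, so here’s one plausible sketch of them (my assumption, not the author’s actual code), treating ByteArrayFormat as a [Flags] enum and PostfixZeroes as a helper that pads zero bytes onto the end of the buffer:

[Flags]
public enum ByteArrayFormat
{
    LittleEndianSigned = 0,     // the BigInteger default
    BigEndian = 1,
    Unsigned = 2,
    BigEndianUnsigned = BigEndian | Unsigned
}

// Hypothetical helper: append 'count' zero bytes so the high bit
// of the original buffer can no longer be read as a sign bit.
public static byte[] PostfixZeroes(this byte[] buffer, int count)
{
    var result = new byte[buffer.Length + count];
    Array.Copy(buffer, result, buffer.Length);
    return result;
}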

Endianness in Bitcoin

In Bitcoin this problem is particularly pressing because different pieces of data are stored in different formats. For example, DER signatures are stored in big-endian form, whereas transaction hashes are stored in little-endian form.

There’s no easy way to know which is which. We just need to read the documentation. Helper methods like the one above will get used a lot.
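For instance (my illustration, using the enum values sketched above), the same number can arrive either way round depending on where it came from:

// A value serialised big-endian, as in a DER signature...
byte[] bigEndianBytes = { 0x12, 0x34, 0x56 };
BigInteger fromDer = bigEndianBytes.ToBigInteger(ByteArrayFormat.BigEndianUnsigned);   // 0x123456

// ...and the same value serialised little-endian, as a transaction field would be.
// Little-endian is the default, so only the Unsigned flag is needed.
byte[] littleEndianBytes = { 0x56, 0x34, 0x12 };
BigInteger fromTx = littleEndianBytes.ToBigInteger(ByteArrayFormat.Unsigned);          // also 0x123456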

« Previous: Digital Signature Code

Next: SEC Serialisation »

Tim Whatsit

Donald Trump casually discarded 700 years of tradition last week, referring to Apple CEO Tim Cook as “Tim Apple.” What are surnames for, anyway? We all knew who he was talking about, especially since Tim Whatsit was sitting right there, keeping a straight face.

TimAppleSubtweet.png

The gaffe made Twitter lose its collective mind – admittedly a low threshold to cross. Tim Cook was celebrated for changing his Twitter name to Tim [apple emoji]. Alex Thingummybob at Wired magazine called it a “most legendary sub tweet.”

If we list mistakes made by heads of state, top to bottom, this gaffe is at the less catastrophic end. All of us talk like this all the time.

“I saw Mark for drinks last night.”
“English Mark?”
“No, rugby Mark.”

Most of us have lots of different names to different people. The Mark I know as “rugby Mark” might be “Christchurch Mark” to one person, “big Mark” to another, and “Mark next door” to someone else.

In the past, names have changed with circumstance. A child might receive a saint’s name at the time they were baptised, their ‘Christian’ name. A woman getting married would lose her patronymic, her father’s name, and gain her husband’s name.

We see a vestige of changing names for changing circumstances in the royal family. Queen Victoria was christened Alexandrina and Edward VII started off as Prince Albert.

We also use customary names for people whose official identity is unknown, like Jack the Ripper.

Add nicknames and diminutives to names that change with context and we have a complex tapestry of identity.

To insiders, all these overlapping identities provide a richly detailed social context. To outsiders, it’s completely opaque. In Seeing Like a State, James C. Scott compares the social context of customary naming techniques to the networks of little alleyways in medieval towns like Bruges. The layout has evolved for centuries to meet the needs of the locals.

InsiderOutsider.png

In contrast, the modern idea that everyone has an unchanging canonical name, registered with the government, is more like the grid layout of Manhattan. It’s dead easy for an outsider to find the intersection of 5th Avenue and 33rd Street but that doesn’t tell you anything about what’s there.

Being easy for outsiders to find is exactly what official names are all about. In England, the idea of inherited surnames arrived with the Normans. After the Norman conquest, King William wanted a stocktake of his new possessions. He ordered a survey of all the lands in England, what’s now known as the Domesday Book.

When the King wants to know exactly who owns what, it’s usually because he plans to take some of it. Knowing the ‘what’ tells the King how big the tax yield might be. Knowing the ‘who’ tells the King who to hit up for the cash. Stable identity, legible to outsiders, is essential to knowing the ‘who.’ In William’s day, tax was tied to land and so surnames were only common among the aristocracy.

Peasants survived another three centuries without surnames. In 1380 the English economy was in turmoil. Constant war with France bankrupted the Crown and the aftermath of the Black Death meant labour was in short supply. King Richard II announced a poll tax.

PeasantsRevolt.png

Unlike King William’s land taxes, Richard’s poll tax fell on everyone. To administer it, county rolls listed everyone subject to the tax, and official names were recorded to make the population visible to the tax collectors. The Peasants’ Revolt, led by Wat Tyler and John Ball (or just Wat and John as they would have preferred to be known), marched on London in 1381, burning the county rolls as they went. Eventually the Revolt was put down, the rolls were redrawn, and surnames gradually became a part of everyday life.

A fixed official name in a government register is like a database ID, a unique identifier that allows a ruler to quantify his property. A list of literal “human resources” is essential for mass taxation and conscription, the financial and military fuel of the modern nation state.

As nation states consolidated their power from the fourteenth to sixteenth centuries, surnames spread with them. As those nation states spread across the globe forming empires, surnames followed. In the Philippines, for example, surnames were mandated on November 21, 1849. Scott, in Seeing Like a State, describes how those who didn’t have surnames were assigned them from a list. People in the provincial capital got surnames starting with A. The Bs and Cs and Es through to Ls were assigned along the coast, M- and S-names went up valleys and the end of the alphabet was assigned to the islands.

Official names have since been supplemented and superseded by passports, IRD numbers, fingerprints, DNA profiles, and facial recognition software, but surnames are what started us down the path of fixed identity and were essential to the rise of the nation state.

FiledStamped.png

Some have speculated that the technological forces that helped Trump to power are undermining democracy and paving the way for techno-feudalism. No wonder tech baron Tim Apple was so good-natured about Trump’s disregard for one of the nation state’s most ingrained institutions.

Measles Goes Viral

Mongol soldiers who died of bubonic plague during the siege of Kaffa were hurled over the city walls with trebuchets in a bid to spread the disease to their enemies. Genoese historian Gabriele de’ Mussi claimed in The Great Dying of the Year of our Lord 1348 that this was the trigger for the Black Death in Europe, which killed 25 million people.

mongols-catapult.jpg

British soldiers used smallpox against Native Americans during the French and Indian War in 1763, and probably targeted Australian Aboriginals with smallpox in New South Wales in 1789.

All major powers conducted biological warfare research during the 20th century despite the use of biological weapons being banned by the Geneva Protocol in 1925 and their production and stockpiling being banned by the Biological Weapons Convention in 1972.

The World War III that gripped imaginations for four decades was one of massed infantry, with Soviet and American tank divisions pounding each other, quickly followed by nuclear holocaust. Horrific as that would have been, biological and chemical weapons were considered beyond the pale – too nasty to use.

McLuhanInfoWar.png

The World War III that we got instead was that described by Marshall McLuhan: “World War III is a guerrilla information war with no division between military and civilian participation.”

Modern low-grade information warfare doesn’t respect any of the niceties of traditional war between nation states. There are no uniforms, no distinction between soldiers and non-combatants, no prisoners are taken, no respect for borders, and no prohibition on the spreading of disease.

The measles outbreaks in Canterbury, Waikato, and Auckland, like others in Washington and California, are one more front in the information war pulling our political systems apart.

Social media have allowed people with fringe beliefs to find each other, and recommendation algorithms tuned for virality push people towards more and more material that reinforces those beliefs. Platforms like Facebook and Twitter have broken the manufactured consent that held mainstream politics together for decades. The disruption of the party line brought us Trump and Brexit. By the same mechanism, it’s no coincidence that recruitment of ISIS brides, flat-earthers, and anti-vaxxers has boomed at the same time. Measles has gone viral.

Anti-vaxxers are an eclectic bunch – some on the far left who hate “Big Pharma”, some on the far right who see mass vaccination as a government conspiracy. The original anti-vaxxers arrived alongside Edward Jenner’s first vaccines, objecting to the inoculation of humans with cowpox. Objections were often religious, claiming that preventing disease was interfering with God’s will. Satirical cartoons showed victims of inoculation sprouting bovine features from their arms, buttocks, and faces.

The_cow_pock.jpg

The number of people opting out of vaccination has rocketed upwards in the last twenty years. In parts of Los Angeles, some schools have opt-out rates of 60 to 70 percent. The vaccination rate in West L.A. is now as low as in South Sudan. South Sudan’s excuse is that it’s a poverty-stricken failed state in the midst of a civil war. In Beverly Hills, the local folk medicine prescribes kale smoothies as a cure for all ailments.

The foot soldiers in an information war often don’t even know that they’re participants in something bigger. They may simply be passionate about a single issue, pro or con. They may enjoy winding people up on the Internet for entertainment. They may be failing B-grade celebrities looking for a bandwagon to hitch their flagging fortunes to.

Others know exactly what they’re up to. Russian trolls are manufacturing a vaccination debate to sow discord in American society. They’re neither pro- nor anti-vaccination but argue both sides in an attempt to create division and distrust.

Facebook advertising is cheap and frighteningly effective, enabling strife in a way that wasn’t possible prior to the rise of social media. During the Cold War, fifty gigatons of Soviet nuclear warheads did nothing to reduce American power. These days, Russian intelligence agencies can organise riots on American streets with $200 worth of Facebook ads.

In May 2016, Heart of Texas, a front for Russian intelligence, organised a protest to “Stop the Islamization of Texas.” United Muslims of America, another Russian intelligence front, organised a rally to “Save Islamic Knowledge,” at the same time and place. Hilarity ensued.

Analysis of bot-generated vaccination tweets shows an even split between pro- and anti-vaccination positions, as if the perpetrators are more interested in continuing the fight than in one side winning. Many of the tweets tie their reasoning to God, constitutional rights, or animal welfare – preferring topics that are already divisive, compounding the disruptive effect.

How do we make sense of things in a world where the manufacture of propaganda is cheap, easy, and global? Today, everyone is a publisher and instant communication connects everyone on the planet. We have to learn to live in an information environment unlike anything we’ve seen before.

Countering disinformation is extremely difficult. Throwing more data and more studies at sceptics is counterproductive. Explaining things to non-believers just reinforces their beliefs. The truth may be the only thing some people are immune to. But we can’t suppress disinformation either. Censorship is neither moral nor practical.

Marshall McLuhan’s guerrilla information war is in full swing. There’s no protection for non-combatants, no protection for children, and no Biological Weapons Convention. Spreading disease is just another day at the office for a Twitter bot.

Make Christendom Great Again

“Trump is Hitler,” screamed the more excitable parts of the media throughout 2016. The logic seems to be: everything unlikeable is “fascism”; Trump is especially unlikeable and therefore must be especially fascist. You don’t get more fascist than Hitler, ergo …

This is reflex rather than thought. If we’re really looking to compare Trump to an historical German, Martin Luther is a much better candidate.

I recently watched a CNN series on Netflix called Race for the White House. It covers six presidential election races from the last couple of centuries, usually races where an underdog has won, and examines what happened. In two of the six cases, the winner was not just an unlikely candidate but the candidate of an entirely new political party.

More often than not, a change in communications technology has led to the upset, although this wasn’t explicitly called out by the show. CNN may not want to admit that dominant media can rapidly become irrelevant.

PresidentsAndMedia.png

In 1828 Andrew Jackson became the first ever Democratic Party president, beating out the incumbent John Quincy Adams. Adams was an old-style aristocratic leader, not interested in anything as crass as campaigning. Jackson knew that a popular election in a democracy could be won by appealing to people and used newspapers effectively for the first time.

The Republican Party got its first president in 1860 – Abraham Lincoln. Lincoln was a captivating speaker, but his team also knew how to get his words out quickly to a national audience using the telegraph.

JFK was famously the first television president, beating out Richard Nixon, who was clearly uncomfortable with the new medium.

Jackson and Lincoln both used new technologies to bring new parties into power. JFK didn’t bring a new party into power, but the new civil rights era Democratic Party was a new thing in an old skin, nothing like the Democratic Party of the Old South.

Likewise, the Republican Party brought into power by Donald Trump is not the Republican Party of old, but a new creature that has taken over the body of its host. He, too, has used a new communications medium, Twitter, to launch his insurgency.

The archetypal example of a new communications technology overthrowing incumbents is, of course, the Reformation. A dominant institution, the Catholic Church, and its political system, feudalism, met their match in Martin Luther. Luther and his Protestants used Gutenberg’s newly-invented printing press to great effect, producing huge numbers of pamphlets written in German rather than Latin to spread their message.

Luther was a 16th-century shitposter extraordinaire. Infamous for his short temper and way with words, Luther would have nailed Twitter.

LutherTweet.png

Trump and his supporters see the modern political system as decayed and corrupt, in much the same way as the Church was seen five hundred years ago. They want to see America taken back to its glory days, some mythical bygone golden age, to “Make America Great Again”.

Luther set off a revolution that created the modern world, but that’s not what he was trying to do. In a similar vein to Trump, he thought that the “modern” (i.e. 16th century) church had become decayed and corrupt. Despite his fascination with Gutenberg’s press, the most modern communications tool available in late-medieval Europe, his revolution, like Trump’s, was backwards-looking. He wanted to revert the Church to its lost golden age. He wanted to “Make Christendom Great Again”.

Ultimately, Luther set Europe on a track that led to the world as we know it today – nation states, democracy, and the scientific revolution. But that was completely unintended. And to come out the other side, Europe had to suffer through the Thirty Years’ War, a war that killed a quarter of the population of Germany – half the population in some areas – and many more outside Germany. Its embers still flicker.

As our decrepit, debt-ridden democratic welfare states totter toward their failure, we have no idea what’s going to be on the other side of the upheaval. The modern state is decayed and corrupt, built on top of technologies that have irreversibly shifted, and the world is changing, as it did during the Reformation.

There’s no going back to “normal,” to the days before Trump, and pretending it was all a bad dream. But there’s no going back to the golden age imagined by Trump’s supporters either. We need to invent the future.

Knowing what happened five hundred years ago should make us think about the path we take from here. The biggest question in politics today is: how do we get to where we’re going without another Thirty Years’ War?

ThirtyYearsWarRepeat.png

The Apollo Mirage

The last time anyone walked on the moon was the day before I was born. I’m a bit bummed at missing the show.

Apollo17Rover.png

And what a show it was. The Soviet Union had taken a surprise early lead in the Space Race, putting Sputnik, Earth’s first artificial satellite, into orbit, and following up with Yuri Gagarin’s first human spaceflight. “Today, for the first time, a man has flown in space,” announced an American newsreader, “and that man is a communist,” he concluded, voice as grave as America’s Cold War second-placing.

Against that dire background, President Kennedy ignited America’s imagination, upping the stakes with a race to the moon, and kicking off the Apollo project.

KennedyMoonQuote.png

Apollo, and the four hundred thousand people who worked on the project, burned through twenty-five billion dollars, back when a billion was a lot, and achieved Kennedy’s goal. “We came in peace, for all mankind,” America announced, happy to have spectacularly overpowered the Russians.

The United States was the richest, most powerful, most scientifically advanced entity that had ever existed.

It’s no wonder that in moments of malaise, when we feel that something big needs to be done, people call for a new Apollo project. It might be a new Apollo project for clean energy, or infrastructure renewal, or, for those with more stunted lateral thinking, flying big rockets to somewhere.

A visionary politician, a starry-eyed speech with a decade-long commitment, and a shit-ton of government money and we can do the impossible, unite the people, and revitalise the nation.

But any new Apollo project is a mirage.

Apollo was a child of the Cold War and conditions that no longer exist. The Cold War was a spending competition. Superpowers, by definition, were those countries that controlled the most firepower. The signifiers of superpower, aircraft carriers, stealth bombers, and intercontinental ballistic missiles, all carry unbelievable explosive force and are all ferociously expensive.

Ever since gunpowder arrived on the scene at the end of the fifteenth century, blasting through protective city walls, the key to political power has been the ability to purchase firepower. Gunpowder gave way to high explosives and then to nuclear weapons. States became empires. Empires became superpowers.

Global politics became a winner-takes-all game of who could buy the most explosives. At the end of World War II, we were down to the final round: the United States of America versus the Soviet Union, for the superpower championship of the world.

The key question of the Cold War was: who can buy the most bombs? Was it going to be the Soviet government with total control of the economy of the largest country in the world? Or was it going to be the American government, taxing the profits of a freer economy?

The point of the Apollo project had very little to do with science, and even less to do with “peace for all mankind.” It had everything to do with upping the stakes in a massive spending competition. The point of the Apollo project was to be expensive for the sake of being expensive, in a way that demonstrated to the world America’s ability to spend extravagantly on firepower.

The point was made. It’s more effective to tax half the wealth of a relatively free economy than to take the entire output of a planned economy. Ultimately the Soviet Union couldn’t afford to keep up with the aircraft carriers and stealth bombers and Star Wars lasers and the empire collapsed, leaving the United States as the dominant military force on the planet.

There will never be a “new Apollo project” because the conditions that made the Apollo project possible no longer exist. The logic that underpinned Apollo was the logic that “he with the most firepower wins.” But at the same time that that logic was reaching its peak, with the moon shot and the nuclear arms race, the technology that supported that logic was changing.

The centuries-long rule of increasing firepower has exhausted itself. The two superpowers created weapons that are literally too dangerous to use. At the same time, both superpowers have been fought to a standstill in the mountains of Afghanistan and elsewhere by enemies using cheap improvised weapons. Technology increasingly favours the defender.

Apollo was a success because it matched the political reality of its day. Any “new Apollo project” would achieve all of the expense and none of the benefits. Anyone trying to sell you a “new Apollo project” is probably looking to grab the power that comes with controlling that much spending and isn’t too bothered about achieving any of the supposed benefits.

Censal Ineptitude

Last year’s New Zealand census was “bungled,” “costly,” and “chaotic,” “a digital-first experiment gone wrong,” according to a Newsroom story yesterday.

I didn’t even know the census was on: I no longer watch broadcast TV news or get the newspaper, and I run an ad-blocker in my web browser. It used to be easy for the government to get its “public service” messages out, but no longer. One of the many ways centralised control is failing in a fractured mediascape.

Statistics New Zealand’s failure to successfully take stock has bothered some but failure to take a census isn’t all bad. Sometimes taking a census is downright sinful.

KingDavidCensus.png

King David’s pride caused him to perform a census of his people and land. As punishment, God gave King David the choice of famine, war, or plague. David chose the plague and 70,000 people died. Better adjust that headcount downwards. There’s a bullet that Stats NZ has dodged for us.

Twentieth-century censuses are better documented, as evidenced by the 62-page notes section at the back of IBM and the Holocaust. Having accused one of America’s largest corporations of complicity in one of modern history’s greatest crimes, author Edwin Black remains surprisingly un-sued.

Black details how IBM and its German subsidiary helped the Nazi government tabulate census data in order to round up Jews for transportation to concentration camps. The part of the book that most sticks in my mind is what happened during the occupation of France.

FrancePunchCards.png

Germany is stereotypically efficient and is the sort of place you can imagine being able to run a census. France is more of a knock-off-at-four-thirty-for-drinks type of place.

If you were a European Jew in 1940, this was a critical difference. In places with detailed, organised census records, the Holocaust cost the lives of 90% of the Jewish population. In France it was more like 25%. The shambles of French official statistics saved countless thousands of Jewish lives.

The one French official who seemed to know how to operate the tabulating machinery, René Carmille, turned out to be working for the Resistance. After a two-year attempt to run a census, his office had neglected to collect the crucial Jewish identity data. It had, however, managed to put together a secret register of 300,000 World War I veterans ready to mobilise for liberation after D-Day.

Capable people are dangerous – you’d better hope they’re on your side. Ineptitude is often the best we can hope for.

Sometimes I worry about all the data that Five Eyes hoovers up from social media and phone records but I’m comforted by the idea that they probably have no idea what they’ve got and, if they did, no idea what to do with it.

No doubt, many people and organisations find census data to be a useful tool and will be upset by the shambolic 2018 census. I am more sanguine. The benefits from a census are small and known. But the costs when things go bad can be extreme. Collecting vast troves of personal data is the sort of activity investors call “picking up pennies in front of a steamroller.”

TalebTurkeyProblem.png

Trading small, known benefits for unlikely but catastrophic losses eventually leads to ruin. I, for one, cheer on the ineptitude.