April 25, 2024

9 dark secrets of the federated web

Robert Frost once wrote that good fences make good neighbors. Today, many developers feel the same way about the internet—longing for a world where websites and their servers each live in separate spaces, free from entanglement. Aside from the oligarchs, just about everyone likes the idea of a federated web.

The term federated alludes to federalism, the philosophy that guides the political structure of the United States. Each state retains its sovereignty, and the entire country benefits from that independence. The federated web is meant to work the same way. As an ideal, it offers a combination of resilience, flexibility, and distributed power that burns brightly for those who value freedom. In reality, the web today is a mix of independent islands and tightly integrated silos. There are many sites that work together at arm’s length, embodying federated design, and there are also walled gardens, where a central administrator dominates every interaction and control is the modus operandi.

For all the perceived advantages of a world populated by independent fiefdoms and principalities, the federated web has its drawbacks. In the interest of understanding, let us consider some of the dark secrets of the federated web—hidden problems that few of us like to look at. These issues may not be reason enough to abandon the vision, but they can help us develop more balanced technical solutions.

No economies of scale

Many mergers and rollups are driven by economies of scale. Hundreds or thousands of independent websites mean hundreds or thousands of databases filled with accounts, logs, and other overhead. Each needs a separate systems administrator, database administrator, or devops team. When the numbers start to reach into the millions or billions, the economic pressure to pull everything under one roof is powerful.

Open source platforms like Drupal or WordPress offer a solution, allowing individual sites to maintain their independence while handing off much of the development complexity and overhead to a larger system.

More logging

When two or more sites in the federated web want to collaborate, they start by checking authorizations, which they do by swapping packets of data. All this information adds to the bandwidth charges—and the cost of storing the logs. While data storage is cheap, and bandwidth costs aren’t bad for small packets, the relentless stream of authorizations and coordination quickly adds up.
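To get a feel for the scale, here is a rough back-of-envelope sketch in Python. The traffic volume, log-entry size, and retention period are hypothetical figures chosen purely for illustration.

```python
# Rough estimate of federated authorization log volume.
# All figures are hypothetical assumptions, for illustration only.

auth_exchanges_per_day = 2_000_000   # cross-site authorization checks per day (assumed)
bytes_per_log_entry = 1_200          # average size of one logged exchange (assumed)
retention_days = 365                 # how long the logs are kept (assumed)

daily_bytes = auth_exchanges_per_day * bytes_per_log_entry
retained_bytes = daily_bytes * retention_days

print(f"Daily log volume:     {daily_bytes / 1e9:.1f} GB")
print(f"Retained over a year: {retained_bytes / 1e12:.2f} TB")
```

None of these numbers is alarming on its own, but multiply them by every pair of federated partners and the overhead becomes a real line item.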

Some developers want to go one step further and use technology like the blockchain to track an endless stream of transactions and events. The work of collecting these events and blessing them with the blockchain’s assurance means even more overhead, especially if the computationally burdensome proof of work consensus algorithm is used. Even lighter-weight algorithms like proof of stake or a managed blockchain add to the burden of record keeping.

Digital signatures everywhere

The science of cryptology has given us many good algorithms for creating digital signatures that can certify every interaction in the federated web. The mathematics is not bulletproof, but it is powerful, and it goes a long way toward assuring the authenticity of data packets.
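As a concrete illustration, here is a minimal sketch of signing and verifying a packet with Ed25519 keys, using the third-party Python cryptography package. The packet contents and key handling are simplified assumptions; a real federation would also need to distribute and rotate the public keys.

```python
# Minimal sketch: signing and verifying a data packet with Ed25519.
# Requires the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The sending site holds the private key; its partners hold the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

packet = b'{"user": "alice", "scope": "read", "issued": "2022-01-01T00:00:00Z"}'
signature = private_key.sign(packet)

# A receiving site verifies the signature before trusting the packet.
try:
    public_key.verify(signature, packet)
    print("packet accepted")
except InvalidSignature:
    print("packet rejected")
```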

The good news for the federated web is that some organizations are starting to deploy these same security measures in their internal networks. Even though the databases and servers are all run by the same enterprise, many security professionals are embracing a zero-trust architecture, which insists that each machine interrogate every packet.

Caching is complicated

Much of the internet’s speed relies on smart caching policies. Federated architectures, though, can run into legal and practical hassles with caching. A friend spent months redoing the checkout system for an online store where he worked; the credit card processors’ rules against caching caused some of his biggest performance problems.

Federated sites may be willing to share information one time, but they may also have strict rules about how much data you can retain from the interaction. Perhaps they’re worried about security, or they could be worried you’ll cache enough data that you won’t need them anymore. In any case, caching is often a hassle with federated sites.
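One workable compromise is to cache only what each partner’s retention rules allow. The sketch below uses a hypothetical per-partner TTL table; in practice the limits would come from contracts or from the partner’s response headers.

```python
# Minimal sketch of a cache that honors per-partner retention limits.
# Partner names and TTL values are hypothetical, for illustration only.
import time

RETENTION_SECONDS = {
    "partner-a.example": 300,  # allows five minutes of caching (assumed)
    "partner-b.example": 0,    # effectively "no-store" (assumed)
}

_cache = {}  # (partner, key) -> (expires_at, value)

def put(partner, key, value):
    ttl = RETENTION_SECONDS.get(partner, 0)
    if ttl > 0:
        _cache[(partner, key)] = (time.time() + ttl, value)

def get(partner, key):
    entry = _cache.get((partner, key))
    if entry is None:
        return None
    expires_at, value = entry
    if time.time() >= expires_at:
        del _cache[(partner, key)]  # evict data we are no longer allowed to hold
        return None
    return value
```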

Forgotten security holes

One way that sites try to simplify federated relationships is to store authorizations and keep them working for months or years. On one hand, users like saving the time it takes to reauthorize. On the other hand, they often forget that they’ve authorized some distant server, which can become a security hole. There’s no simple solution. Asking users to authorize too often is annoying and time-consuming. But not asking often enough leaves security holes. Some sites send a message every few months, asking users to review their authorized connections. That’s just a soft way of making them reauthorize.
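A periodic sweep can at least surface the forgotten grants. The sketch below assumes a hypothetical record layout and a 90-day review threshold; a real system would pull the records from whatever store holds the tokens.

```python
# Minimal sketch: flag long-lived authorizations for review.
# The record layout and the 90-day threshold are assumptions for illustration.
from datetime import datetime, timedelta

REVIEW_AFTER = timedelta(days=90)

authorizations = [
    {"partner": "photos.example", "granted": datetime(2021, 3, 1)},
    {"partner": "billing.example", "granted": datetime(2022, 1, 15)},
]

def stale(grants, now=None):
    now = now or datetime.now()
    return [g for g in grants if now - g["granted"] > REVIEW_AFTER]

for grant in stale(authorizations):
    # In practice, this might email the user or revoke the grant outright.
    print(f"review authorization for {grant['partner']}, granted {grant['granted']:%Y-%m-%d}")
```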

Cascading security failures

Ideally, a federated architecture should be resilient, particularly against security failures. But systems sometimes end up affecting each other, so that a problem with one can bring them all down. If multiple sites in a federation depend on one partner for, say, authorization or identification, then this partner becomes a potential weak link. It’s not uncommon for a failure in one site to lead to a cascade of security failures.

Vulnerable dependencies

If you ever want to scare a Java developer, mention the open source logging framework Log4j. When a security vulnerability was discovered in the framework, which is embedded in a huge share of Java applications, developers around the world scrambled to patch holes they didn’t know existed. Developers need to trust that their libraries are secure, yet there is no way to certify code safety without testing every line of code.

The federated web brings a similar kind of danger. Your code might be clean, but what do you know about the websites you partner with, or about their partners? Federated web idealists imagine a vast, rich collection of interconnected sites that can be as public or as anonymous as they need to be. The challenge is creating real accountability within that system. No one wants to rely on code vetted only by an unaccountable team, and the same is true for the websites in a federation.
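One partial answer is to audit the dependencies you can see. The sketch below queries the public OSV vulnerability database for a single pinned library version; the endpoint and request format reflect the OSV API as publicly documented, but treat the details as assumptions to verify before relying on them.

```python
# Minimal sketch: ask the OSV database whether a pinned dependency has
# known vulnerabilities. Check the endpoint and payload against current
# OSV documentation before use.
import json
import urllib.request

def known_vulnerabilities(name, version, ecosystem):
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    request = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response).get("vulns", [])

# Example: an old Log4j release that predates the 2021 fixes.
for vuln in known_vulnerabilities("org.apache.logging.log4j:log4j-core", "2.14.1", "Maven"):
    print(vuln.get("id"))
```

Scanning your own manifest is the easy part; the harder problem is that you cannot run the same audit against a partner’s stack.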

Monoliths rule anyway

Monolithic corporations like Amazon and eBay are actually constellations of millions of smaller companies. While they may appear to users as one giant system, there’s often quite a bit of federation inside. The difference is in the concentration of power. The central company makes the decisions, and the smaller companies do as they’re told.

The conundrum is that all the work required to maintain a federated web must be done, and the entity that does it inevitably holds centralized power. The system evolves toward central control, no matter how much architects try to engineer around it.

Too much complexity

At the end of the day, people—both users and engineers—struggle with complexity. Password reuse is a simple example of how users undermine the federated web: people can’t remember hundreds of different passwords, so they use the same one again and again. In theory, each site should maintain an independent security layer, but in reality users can’t handle that much complexity, so they are constantly undermining the security of the federated web.

Competition and freedom to choose are wonderful options, responsible for much of the diversity that makes the internet irresistible. But managing true federalism brings a level of complexity that is often more than real people—and the real systems we build—can manage.
