Seeing Broadband Like a State

A commenter on my post “Why Large corporations?” asks:

Any thoughts on subscription tv and/or broadband service? There are obviously economies of scale for the providers. But, how does that impact consumers?

I think it is relatively easy to tell a story in which the state prefers less competition. For one thing, it can tax the heck out of broadband service. For another, the process of awarding cable franchises fattened many a local politician’s wallet, because the rent-seeking behavior was so heavy.

Technically, I believe it is possible to imagine a world in which spectrum is not allocated to specific uses. Instead, devices are designed to share spectrum, automatically roaming to find the best available band on which to communicate. This approach would take the FCC totally out of the picture. It might allow wireless broadband Internet access without people having to sign up for service from a big telecom company. Not surprisingly, the FCC is not pushing this. From a political perspective, the status quo is much preferable.
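To make the idea concrete, here is a minimal sketch of the listen-before-talk logic such a device might run. The candidate band list, the noise threshold, and the measure_power_dbm stand-in are all hypothetical placeholders; real spectrum sensing is far more involved.

```python
import random

# Hypothetical candidate bands, in MHz, that a spectrum-agnostic radio might scan.
CANDIDATE_BANDS = [(902, 928), (2400, 2483), (5150, 5350), (5470, 5725)]
IDLE_THRESHOLD_DBM = -95  # assumed power level below which a band counts as idle

def measure_power_dbm(band):
    """Stand-in for real energy detection; a radio would sample the air here."""
    return random.uniform(-110, -60)

def pick_band():
    """Listen before talking: use the quietest candidate band, if any is idle."""
    power, band = min((measure_power_dbm(b), b) for b in CANDIDATE_BANDS)
    return band if power < IDLE_THRESHOLD_DBM else None  # back off if all busy
```

The hard parts the sketch waves away (getting two devices to agree on a band, and not trampling incumbent users) are exactly what the cognitive-radio research literature is about.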

Look, maybe I am paranoid. I picked up on the Open Spectrum idea years ago from lefty-techies, and as far as I know they have not backed away from their view that this approach to spectrum usage is feasible. If they are correct, then broadband Internet might very well be a poster child for an industry where large corporations emerge because of government policy rather than any inherent efficiency.

4 thoughts on “Seeing Broadband Like a State”

  1. It is impractical to have totally spectrum-agnostic wireless systems.

    For one thing, the hardware would need to be large and very power-hungry — we don’t have technology that supports tuning from a few megahertz up to multiple gigahertz in one economical unit.

    Antennas would also be complicated; while there has been good progress in making antennas that work for most of the current mobile-phone bands, going much lower in frequency would require much larger antennas. A lot of people probably remember the antennas that CB and ham radio operators used; they were large because an ideal antenna scales with the wavelength, which is inversely proportional to frequency.
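    A back-of-the-envelope calculation makes the scaling concrete; the bands below are just illustrative picks:

    ```python
    C = 3.0e8  # speed of light, m/s

    def quarter_wave_m(freq_hz):
        """Length of an ideal quarter-wave monopole: lambda/4 = c / (4 * f)."""
        return C / (4 * freq_hz)

    for label, f in [("CB radio, 27 MHz", 27e6),
                     ("LTE band 12, 700 MHz", 700e6),
                     ("PCS, 1.9 GHz", 1.9e9)]:
        print(f"{label}: {quarter_wave_m(f) * 100:.1f} cm")
    # 27 MHz -> ~278 cm whip; 1.9 GHz -> ~4 cm, small enough to hide inside a phone.
    ```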

    Also, it would be awkward to use: if my mobile phone needed to scan multiple gigahertz of spectrum to find one of the relatively small slices that my service provider uses, then (especially if someone else is transmitting on the same bands) it could take minutes or hours before it got service. Dedicating slices of spectrum to particular protocols is what makes it practical to turn my phone off from time to time: when it comes back up, it knows exactly where to look. That problem becomes much worse for first responders and emergency services — not only are their time-to-link requirements shorter, they probably also need to use lower frequencies so that they can operate more than a few miles from their base station.
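    Rough arithmetic suggests why. The search range, detection bandwidth, and dwell time below are all assumed numbers, but the order of magnitude is the point:

    ```python
    SPAN_HZ = 6.0e9 - 0.5e9   # hypothetical 0.5-6 GHz search range
    RESOLUTION_HZ = 100e3     # assumed detection bin width
    DWELL_S = 0.005           # assumed listen time per bin

    bins = SPAN_HZ / RESOLUTION_HZ
    sweep_min = bins * DWELL_S / 60
    print(f"{bins:,.0f} bins -> {sweep_min:.1f} minutes per blind sweep")
    # ~55,000 bins -> ~4.6 minutes, before any retries forced by interference.
    ```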

    We could probably invent a better system than we have today, but I think it would be incrementally better rather than radically better.

  2. It seems to me that functional mesh networks could obviate the need for established carriers. This might even be done without needing to go beyond currently unlicensed spectrum and power limits, given the density of devices we find, at least in urban environments. With the right protocols in place for existing and new devices, one could probably replace existing carriers within a decade.

    I would bring a startup looking to pursue that onto Blueseed in a heartbeat 😉

  3. Mesh networks promise a lot but generally deliver little unless they have usage and management homogeneity. They work well enough for applications like latency-tolerant sensor networks, but rather poorly for general data usage, and horribly for things like interactive voice or video. General-purpose wireless mesh research is good for burning up government grants and fueling Ph.D. theses, but probably won’t change that situation much.

    Just think: if there are 20 nodes on the path between you and me, our packets take up 10 to 20 times as much spectrum (depending on how you count spatial reuse) as they would going over hard lines between our respective local nodes. Avoiding spectral hot spots becomes a huge routing problem, and routing problems are inherently hard because there isn’t a central policy arbiter; the expected autonomous system (AS, in routing parlance) size approaches one node.
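    As a toy model (treating every wireless hop as a full retransmission over the air, and folding spatial reuse into a single assumed divisor), the multiplier is just hops divided by reuse:

    ```python
    def airtime_multiplier(hops, reuse_factor=1.0):
        """Each wireless hop re-sends the packet over the air; spatial reuse lets
        widely separated hops transmit concurrently, dividing the on-air cost."""
        return hops / reuse_factor

    for reuse in (1.0, 2.0):
        print(f"20 hops, reuse factor {reuse:g}: "
              f"{airtime_multiplier(20, reuse):.0f}x spectrum")
    # reuse 1 -> 20x, reuse 2 -> 10x: the 10-to-20x range mentioned above.
    ```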

    When any of the nodes or links goes down — which would happen with some regularity, because the links are wireless and the nodes are seldom designed as high-reliability infrastructure — the congestion and latency issues get worse. You can mitigate the reliability/latency problems through channel coding, but even with state-of-the-art fountain codes, you’ll still inflate spectrum usage by some additional percentage. (If your coded message size is N and the number of independent paths between endpoints is K, then when N > K, the achievable overhead scales with 1/K; you would need an oracle to keep your expected packet loss rate below 1/K. If your message is small enough that you do have that many independent paths, the fountain code’s constant overhead is probably a significant percentage of the total.)
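    That constant overhead is easy to see in simulation. Below is a toy random-binary fountain code (not a real LT or Raptor implementation): the receiver collects random XOR combinations of the k source packets until it can decode, and the number of extra packets needed stays roughly constant as k grows, which is why it looms large for small messages:

    ```python
    import random

    def packets_needed(k, trials=2000):
        """Count coded packets collected until k random GF(2) combinations
        of the k source packets reach full rank (i.e., become decodable)."""
        total = 0
        for _ in range(trials):
            basis = []
            received = 0
            while len(basis) < k:
                row = random.getrandbits(k) or 1   # random nonzero combination
                received += 1
                for b in basis:                    # reduce against current basis
                    row = min(row, row ^ b)
                if row:                            # independent: extend the basis
                    basis.append(row)
            total += received
        return total / trials

    for k in (10, 100):
        avg = packets_needed(k)
        print(f"k={k:>3}: {avg:.1f} coded packets on average "
              f"({avg / k - 1:.1%} overhead)")
    # The ~1.6 extra packets barely matter at k=100 but are ~16% at k=10.
    ```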

  4. Wireless networks are much slower than wired ones, and will be for the foreseeable future. Thus, I don’t see how any form of wireless networking can replace broadband. Fiber is fast.

    That said, I don’t know how the regs work. Yes, it’s expensive to pull wires all over creation. However, you’d think there would be more than *two* companies giving it a go: phone and cable. Odd to think of a service so very new already being run like a utility.

Comments are closed.