My Network World column and outside links on network neutrality
Oops! It turns out Network World ran my column on network neutrality and Tariff Rebate Passthrough on April 23, not April 30 as I previously believed. So I should have gotten my list of outside links together sooner. Sorry. Confusing matters further, my post on Jeffersonet vs. Edisonet got Slashdotted before I had provided a link to the column itself. Well, here goes.
- Ed Whitacre, CEO of SBC/AT&T, kicked things off in late 2005, arguing that he should be allowed to discriminate between, say, Google and Yahoo based on how much they paid him. He later backed down a bit, but many people are unconvinced as to his sincerity.
- Verizon said similar things around the same time.
- Om Malik made an early attempt to cover all sides of the issue, albeit with a pro-neutrality orientation.
- Jeremy Penston had an article last week arguing that bandwidth is NOT “effectively free,” at least for video.
- Scott Cleland says my thinking is “seriously flawed” because I think there’s any need for neutrality regulation. He thinks no logical case has been made for neutrality whatsoever. Perhaps we need to spell it out for him in smaller words. Anyhow, his one-pager, admittedly sponsored by the telecom vendors, does a good job of smashing some net neutrality strawmen that nobody was advancing in the first place.
- Errata Security argues that network neutrality is unworkable on the backbone, and links to my article, which was only about network neutrality on the last mile.
- Richi Jennings argues that we don’t need tiering, because the speed of light has dropped from 300 million meters/second to a mere 80 million or so. Or maybe less. Or something like that … seriously, he seems to doubt the value of super-high QoS, because latency is an inescapable fact of life; the back-of-the-envelope sketch below illustrates the point.
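To make the latency point concrete, here’s a minimal back-of-the-envelope sketch in Python. It’s my own illustration, not Richi’s arithmetic; the route distances and the roughly 200-million-meters/second in-fiber propagation speed are assumptions.

```python
# Back-of-the-envelope propagation latency: no amount of QoS
# prioritization can beat the speed of light in fiber.

SPEED_IN_FIBER_M_PER_S = 200_000_000  # rough assumption: ~2/3 of c in vacuum

def one_way_latency_ms(distance_km: float) -> float:
    """Minimum one-way propagation delay over fiber, ignoring routing,
    queuing, and serialization delays (all of which only add latency)."""
    return distance_km * 1_000 / SPEED_IN_FIBER_M_PER_S * 1_000

# Illustrative route lengths (approximate great-circle distances)
for route, km in [("Boston to New York", 300),
                  ("New York to London", 5_600),
                  ("New York to Tokyo", 10_800)]:
    ms = one_way_latency_ms(km)
    print(f"{route}: >= {ms:.1f} ms one way, >= {2 * ms:.1f} ms round trip")
```

However clever the prioritization at each hop, those floors don’t move, which is the gist of the latency-is-inescapable argument.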
Categories: Net neutrality, Public policy and privacy | Leave a Comment |
Link list for network neutrality
My April 30 Network World column is scheduled to be on network neutrality, and it will link to this post as a guide to further research.
Some of my own writings on the subject include:
- This post today separating the Internet into “Jeffersonet” and “Edisonet”, where Jeffersonet needs extreme net neutrality but Edisonet can and must endure tiered pricing. I’d have loved to get that point into the column, but there wasn’t room.
- This post today calling for extreme net neutrality specifically in the area of search.
- Two posts last June (with links to additional prior ones) spelling out the Tariff Rebate Passthrough idea. These cover much the same material as the column, and tie into some discussion of the idea last spring among a variety of commentators.
I’ll supply some outside links on the subject later on.
Categories: Net neutrality, Public policy and privacy | 1 Comment |
The two Internets, Jeffersonet and Edisonet, and why they need to be regulated differently
Edit: This post was Slashdotted, along with Richi Jennings’ reply.
In a way, proponents and opponents of network neutrality are both correct! That is, they are each correct about different aspects of the Internet.
Net neutrality is both necessary and workable for what I call Jeffersonet, which comprises the “classical”, bandwidth-light parts of the Internet. Thus, it includes e-mail, instant messaging, much e-commerce, and just about every website created in the first 13 or so years of the Web. Jeffersonet is the greatest tool in human history to communicate research, teaching, news, and political ideas, or to let tiny businesses compete worldwide. Any censorship of Jeffersonet – even if just of the self-interested large-enterprise commercial kind – would be a terrible loss. Net neutrality is workable for Jeffersonet because – well, because it’s already working just fine. Jeffersonet doesn’t need anything beyond current levels of bandwidth and reliability. So there’s no reason to mess with what’s working, other than simple profit-hungry greed.
Read more
Categories: Net neutrality, Public policy and privacy | 14 Comments |
Check Point UTM-1 and Crossbeam Systems – resolving the confusion
When Check Point first briefed me on their new midrange UTM-1 appliance, they neglected to mention that their hardware designs were first worked out by Crossbeam Systems. Actually, it turns out that they even buy the hardware through Crossbeam. It took a comment here from Crossbeam’s Chris Hoff for me to realize the true story. Today, I connected with Paul Kaspian of Check Point to straighten things out. Here’s the scoop. Read more
Categories: Check Point Software, Computing appliances, Crossbeam Systems, Hardware, Platforms, Security and anti-spam | 3 Comments |
Business intelligence — technology and vendor strategy
The most recent Monash Letter – exclusively for Monash Advantage members – spells out some ideas on BI technology and vendor strategy. Specifically, it argues that there are at least four major ways to think about BI and other decision support technologies, namely as:
- A specialized application development technology. That’s what BI is, after all. Selling app dev runtimes isn’t a bad business. Selling analytic apps hasn’t gone so well, however.
- An infrastructure upgrade. That’s what the BI vendors have been pushing for some years, as they try to win enterprise vendor-consolidation decisions. To a first approximation, it’s been a good move for them, but it also has helped defocus them from other things they need to be doing.
- A transparent window on information. As Google, Bloomberg, and Lexis/Westlaw all demonstrate, users want access to “all” the possible information. BI vendors and management theorists alike have erred hugely in crippling enterprise dashboards via dogmas such as “balanced scorecards” and “seven plus-or-minus two.”
- A communication and collaboration tool. Communication/collaboration is as big a benefit of reporting as the numbers themselves are. I learned this in the 1980s, and it’s never changed. But BI vendors have whiffed repeatedly at enhancing this benefit.
The Letter then goes on to suggest two areas of technical need and opportunity in BI, which may be summarized as:
- “Play very nicely with portals.”
- “Do a much better job of managing personal metrics customization.”
Good launching points for my other research on these subjects are this recent post on analytic technology marketing strategies and two high-concept white papers available here.
Categories: Analytic technologies, Business intelligence, Usability and UI | 4 Comments |
When and why to virtualize
In one of the best Slashdot threads I’ve seen in ages, a number of posters chime in with their personal experiences of virtualization. (Usage hint: Set the general threshold = 5 to filter out the dreck, using Advanced Context Controls.) The rough consensus appears to be:
- Virtualization has overhead, but probably a lot less than the 43-50% sometimes claimed.
- Just to be safe, don’t virtualize apps that are already I/O-bound or otherwise running flat-out; a quick way to check is sketched below. (So there’s no contradiction with my support for dedicated security, networking, and data warehouse appliances.)
- Big enterprises have lots of production servers that are old, unreliable, and/or idle most of the time. Virtualize those.
- If a server’s use is particularly spiky, it may be a great candidate for virtualization.
- Most development servers can and should be virtualized.
Makes sense to me.
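As an illustration of the “already running flat-out” test, here’s a minimal Linux-only sketch of my own; the five-second sampling window and the 20%/80% thresholds are arbitrary assumptions, not anything from the thread.

```python
# Minimal sketch: sample /proc/stat twice to estimate how busy and how
# I/O-bound a box is before treating it as a virtualization candidate.
import time

def cpu_times():
    with open("/proc/stat") as f:
        # First line looks like: "cpu user nice system idle iowait irq softirq ..."
        return [int(x) for x in f.readline().split()[1:]]

def utilization(interval_s=5.0):
    before = cpu_times()
    time.sleep(interval_s)
    after = cpu_times()
    deltas = [b - a for a, b in zip(before, after)]
    total = sum(deltas) or 1          # guard against a zero-length sample
    idle_frac = deltas[3] / total     # field 4 of /proc/stat is idle time
    iowait_frac = deltas[4] / total   # field 5 is iowait time
    return 1.0 - idle_frac - iowait_frac, iowait_frac

busy, iowait = utilization()
if iowait > 0.2 or busy > 0.8:        # arbitrary thresholds for illustration
    print("I/O-bound or running flat-out; think twice before virtualizing.")
else:
    print("Mostly idle or spiky; looks like a consolidation candidate.")
```

In practice you’d sample over days rather than seconds, but the arithmetic is the same.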
Categories: Computing appliances, Platforms, Virtualization | Leave a Comment |
More on Shai Agassi and SAP
Sramana Mitra has a little bit of a different take on Shai Agassi’s departure than mine. At first blush, it’s a distinction almost without a difference. In essence, she argues that Shai was frustrated because he couldn’t make big needed changes fast enough. That’s pretty close to my view that change simply wasn’t happening quickly or completely enough.
But the thing is — I think SAP’s overall technology roadmap has remained too incomplete. In essence — and I know some of my friends there will dispute this — SAP is still too focused on delivering software for how people should work, and doesn’t properly support the way they actually do — or realistically would like to — work.
Yes, it’s great that Dennis Moore and Dan Rosenberg are at SAP. But nobody — and this includes Shai — seems to be driving a real software re-think down into the individual products. The move to portal-based technology needs to be the beginning of the software functionality redesign, not the end. Josh Greenbaum thinks that Duet is all that and more, but I don’t see it that way.
Categories: Enterprise applications, SAP | 6 Comments |
Shai Agassi – a contrarian view
Shai Agassi is leaving SAP because, in essence, the old guard didn’t want to turn over the reins to him as fast as he would have liked.* Often, this kind of departure is a bad thing (e.g., Ray Lane at Oracle). But I suspect that SAP may actually be improved by Shai’s leaving.
*His other stated reasons include two very good and highly admirable ones – working on energy technologies and improving matters in Israel.
SAP’s technical strategy has three core elements:
- Automate business processes.
- Provide the technical infrastructure for automating business processes.
- Encapsulate process and data at the object/process level.
This strategy has been heavily developed and refined on Shai’s watch, with major contributions from lots of other folks. The issue isn’t vision any more. What SAP needs to do better is execute on the vision.
Categories: Analytic technologies, Enterprise applications, SAP | 3 Comments |
Great news for Openwave
Dave Peterschmidt is out as CEO of Openwave, and this is a very good thing. Even better, the company is being shopped. Best news: Jerry Held is on the committee doing the shopping. Not that I agree with Jerry on everything, but on the whole he’s pretty astute.
Openwave will probably find a buyer at a decent price. Dave’s bad, but he doesn’t completely destroy companies; there should still be some value there.
Categories: Enterprise applications, Openwave | 2 Comments |
Three ways to market analytics-related technology
“Decision support”, “information centers”, “business intelligence”, “analytic technology”, and “information services” have been around, in one form or other, for 35+ years. For most of that time, there have been two fundamental ways to sell, market, and position them:
- Access to information
- Application software
More recently – especially the past five years – there’s been a third way:
- Infrastructure upgrade
as early-generation implementations get replaced by newer ones.
At the 50,000-foot level, here’s some of what I see going on:
- Classical BI marketing is floundering. BI vendors don’t know whether they’re in the business of quick/easy information access, analytic apps, or enterprise-infrastructure upgrades.
- A few areas of analytic application are being packaged and marketed well, with solid business-process stories and good customer acceptance of same. The biggies are budgeting/planning and CRM analytics. On the whole, however, analytic apps are floundering, or else are little more than reporting front-ends on operational systems (e.g., in network management).
- Data warehouse software startups are on a roll. Especially at the high end, this is a pure infrastructure-upgrade business. There’s still plenty of room for improvement, but multiple vendors are each doing a good job of marketing on the basis of:
- Speeds and feeds
- Ease of deployment
- Ease of administration
- Price
- Credibility
- Data integration is mainly an infrastructure improvement play. After all, that integration COULD be hand-coded. Automating the process is usually a better-infrastructure story.
- Text search is still an information-access story. There are multiple niches where search is booming. But in all cases the story is information access. Evidently the technology and/or market aren’t mature enough yet for strong infrastructure stories. And in the limited cases where text search gets integrated into general application software packages, it’s usually just for information access rather than a real business process.
- Data mining and predictive analytics are mainly information access plays. Yes, the information being accessed is calculated rather than raw. Yes, I believe that the heart of the data mining market is continuous process improvement. Even so, what users buy from the vendors is usually little more than information toolkits.
- Text analytics is mainly an information access play. Text mining and information extraction have two main uses right now. Either they resemble – and indeed often feed into – data mining, or else they are used to enhance search and search-like document access.
- Information services have always been an information access play. When you think about it, the financial-quote-machine business is a huge part of the whole decision support market. Lexis/Nexis is no slouch either. And they’re all about providing information access.
Related links
- This three-headed taxonomy of strategies is similar to one I previously postulated for Microsoft, SAP, IBM, and Oracle.
- I covered analytic business processes at length in a November 2004 white paper. Unfortunately, industry progress since then has been relatively slow.
- I’ve written voluminously about data warehouse software startups on DBMS2.
- One example of infrastructure focus is the ease-of-deployment trend.
- Web search and generic enterprise search aren’t the only search areas to focus on information access. (And yes, they’re most definitely separate areas.) Even customer-facing structured search does; the information is just tailored according to different criteria. 😉
Categories: Analytic technologies, Business intelligence, Data mining | 5 Comments |