Friday, February 21, 2003

NewsMonster Causes a Stir

Ben Hammersley notes that the comments section on his NewsMonster post is worth checking out. I've been using NewsMonster for a couple of days now, and it meets my needs just fine. Still, I suppose I'm not a particularly demanding user, nor am I a purist, so perhaps I'm not best placed to recommend it to others. I'm sure the debate about the robot exclusion standard is important, but it's a bit beyond me. On the other hand, not having an automatic update does seem a substantial drawback; I don't know if they've got a fix for this in the pipeline. One of the posts made what was, for me, a particularly telling point about those who don't enjoy the luxury of a broadband connection (or only have one at work, where they don't have time (?) to take advantage of it). I acquired a Pocket PC with the intention of chewing up lost travel time with news updates, and then I found blogging. Now I use my 'prime time' travel to plough through some of the highly appealing books piling up on my 'to read' shelf. Who the hell ever said the old was incompatible with the new? The trick is to find a way to put them together.
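For anyone else who finds the robot exclusion standard debate a bit beyond them: the complaint against aggregators like NewsMonster is essentially that a well-behaved crawler is expected to read a site's robots.txt rules before fetching pages, and honour any "Disallow" lines aimed at it. As a minimal, purely illustrative sketch (the user-agent name and rules here are hypothetical, not NewsMonster's actual behaviour), Python's standard library can evaluate such rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a crawler calling itself "NewsMonster"
# is barred from /private/, while everyone else may fetch anything.
rp = RobotFileParser()
rp.parse([
    "User-agent: NewsMonster",
    "Disallow: /private/",
    "",
    "User-agent: *",
    "Allow: /",
])

# The parser answers the only question that matters to a polite crawler:
# "may this user-agent fetch this URL?"
print(rp.can_fetch("NewsMonster", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("NewsMonster", "http://example.com/public/page.html"))   # True
```

In real use the rules would come from `rp.set_url(".../robots.txt")` followed by `rp.read()`; the in-memory list above just keeps the sketch self-contained.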

I haven't tried NewsMonster yet, but based on the discussion, it appears that the functionality that it most closely resembles is the "Offline Web Pages" feature of Internet Explorer for Windows. It also would appear that most people contributing to this discussion have not used this feature before, and therefore don't appreciate just how valuable it is. If you haven't used it, here's a quick overview:

Offline Web Pages drives Internet Explorer just as if a live user were driving it. It stores complete web pages and all linked images and other content elements in IE's regular cache. It's completely user configurable: it can store complete sites or just single pages depending on the URL; it can recursively dive down up to 3 (I think) levels deep; it can follow links to "outside" sites or stay within the domain specified by the initial URL; it can run on a schedule, on system events like startup or shutdown, or on demand; and it can traverse and cache a single site or a whole list of sites.

From the user's perspective, you just run IE, put it into offline mode, then browse the site(s) as you would normally. There's no difference between that and browsing the site online, except that the offline experience is blazingly fast, much faster than browsing online even over DSL or other broadband.

The way I used to use this feature was as follows: I have a half-hour train ride to and from work every day. I had my laptop set to download a list of sites every weekday morning at 5 a.m. and again in the afternoon at 4 p.m. The sites included CNET, NYT-Tech, Wired, GMSV and a few others. I could then read the news on the train using my laptop with IE in offline mode. This was a tremendous time-saver for me. I've since switched to using a Pocket PC for the train ride, but I still use Offline Web Pages for a few sites that I look at in the evenings at home.

Remember that the vast majority of web users are still stuck with 56K dialup, and will be for years to come. Using Offline Web Pages vastly improves the experience of browsing the web in that environment, as well as extending the availability of the web into situations where it isn't currently accessible. Are Offline Web Pages inefficient from a server perspective? Certainly. Nevertheless, the feature is invaluable under certain circumstances.
Source: Ben Hammersley.Com
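The mechanism described above (recursive, depth-limited fetching, optionally confined to the starting site, with everything dropped into a local cache) is simple enough to sketch. This is not IE's actual implementation, just a toy illustration; the `crawl` function, the injected `fetch` callable and the example URLs are all hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, max_depth=3, same_domain=True):
    """Depth-limited crawl from start_url, returning a {url: html} cache.

    `fetch` is any callable mapping a URL to HTML text (in real use, a
    wrapper around urllib.request.urlopen); injecting it keeps the
    sketch runnable without a network connection.
    """
    cache = {}
    start_host = urlparse(start_url).netloc

    def visit(url, depth):
        if url in cache or depth > max_depth:
            return
        if same_domain and urlparse(url).netloc != start_host:
            return  # stay within the initial site, as IE's option allows
        html = fetch(url)
        cache[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            visit(urljoin(url, link), depth + 1)

    visit(start_url, 0)
    return cache


# A toy in-memory "site" stands in for the network:
pages = {
    "http://example.com/": '<a href="/a.html">a</a> <a href="http://other.com/">out</a>',
    "http://example.com/a.html": '<a href="/b.html">b</a>',
    "http://example.com/b.html": "the end",
}
offline = crawl("http://example.com/", pages.__getitem__, max_depth=2)
print(sorted(offline))  # the link out to other.com is skipped
```

Passing `fetch` in as a parameter is the one deliberate design choice here: it separates the crawl policy (depth, domain restriction) from the transport, which is also what makes reading back from the cache in "offline mode" trivial.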

Lessig Makes the Case on Software Patents

The issue of patent protection for software continues to do the rounds. This time it's the turn of the European Parliament to take a view. Lessig eloquently makes the case that patent and copyright law in relation to software is a mess, but Europe is the land of regulation (not all of it bad, by any means), and conservative traditions in Europe are very different from their American counterparts. With the notable exception of the UK, the European idea of individual liberty and privacy bears little resemblance to the US one: something which should have been made clear by the attitude of French law to material posted on Yahoo and eBay. So I'm not especially optimistic. With many of Europe's major companies having lost more money than they want to think about, and with governments who don't understand the internet having been given the excuse they needed by September 11 to focus on the 'security' problem, and on the control rather than the 'freeing' of information, public policy in Europe seems to be hovering dangerously in the region of an all-time low. After a lot of trumpeting about the coming benefits of the 'information age', no-one in Brussels, it seems, is particularly concerned about the absence of a 'critical mass' of basic material to search in any language other than English (if you don't believe me, try surfing the web in, say, French, Spanish or Italian sometime). Europe, as I said, is the land of public policy and regulation, and own-language content is an issue crying out for subsidies if ever there was one. The French government makes no secret of its preoccupations about the future of the French language, but isn't it time to 'put your money where your mouth is'? The answer, unfortunately, seems to be no, as the main talking point these days seems to be how to find a way to charge for content. I'm sorry, Larry, I fear your reasoned plea is destined to fall on deaf ears.

As pressure mounts on the European Parliament to extend patent protection to software, a crisis is developing in US patent law that Europe would do well to consider. The system in America is broken - to the great detriment of software developers generally - and there is no reason to believe the Europeans could do any better.
The claim that the US patent system is in crisis is nothing new. What is new is the identity of those making it. ... Throughout the administration of President Clinton, the Patent Office insisted that the system worked just fine. Patents were being granted for truly novel inventions only, the office said; and innovators had no trouble in identifying who owned what invention. Claims that the system was in crisis were little more than the ravings of Chicken Little. The system would work itself out. It always had.

Yet now the Patent Office is singing a different tune. As its new head, former Republican Congressman James E. Rogan, said in an interview with the L.A. Times on February 7, 2003: "This is an agency in crisis and it's going to get worse. It doesn't do me any good to pretend there's not a problem when there is." The reason is the mess created by the last administration's patent office, especially in the context of business method patents (the type of patent, for example, that gives Amazon an exclusive right to its "one-click" method for selling merchandise online). "Some of [these] were fairly broad," Mr Rogan told L.A. Times reporter David Streitfeld. "We've gone from a 75 per cent acceptance rate to a 75 per cent rejection rate." This early and easy acceptance rate led to an explosion in patent applications and patents granted - and, in turn, in the costs that software developers face. "Developing software is [now] like crossing a minefield," says Richard Stallman, the originator of the free software movement that has developed the GNU/Linux operating system. "With each design decision, you might step on a patent that will blow up your project."

This is the most surprising fact about software patents: they are generally opposed most strongly by the people they are intended to benefit. But such opposition is not difficult for a conservative like Mr Rogan to understand. Patents are a form of regulation. They represent a government decision on who gets a monopoly over what invention. Republicans like to claim that Democrats regulate first and ask questions later. They are therefore more eager to ask the right questions up-front. Yet the questions here have no good answers. Like any form of government regulation, patents make sense only if their benefits outweigh their costs. The public benefit from patents is presumably the inventions that otherwise would not have been made. The costs include the extraordinary burden of knowing just what innovation is and is not subject to a government monopoly. These costs are borne both by innovators seeking a patent and by those just writing code. Both must wade through incomprehensible claims about who owns what inventions to avoid the inevitable hold-up if their code proves successful.

Software developers are quite aware of these costs. Yet economists have found it very hard to reckon any net benefits. And thus conservatives are increasingly sceptical of this form of regulation. No doubt it has produced "a whole cottage industry of shysters," as Mr Rogan admits. It is harder to show that patents have produced any gain that would justify their costs.

The issue is not just a problem of implementation. The weakness runs much deeper. It may well be that software development requires some form of government protection. It does not follow, however, that patents are the protection that software needs. Software already receives the protection of copyright and trade secret. (The "code" of software is a kind of writing that copyright protects; and the properly hidden secrets that stand behind software can be protected like any other business secret.) These too have their critics: the term of software copyright is effectively perpetual; and trade secrets tend to hide, not spread, knowledge. But if these forms of protection are inadequate or misinformed, then the solution is to find a form that better fits software. No one really believes that patents are well designed for this type of invention. Yet no government has adequately explored the alternatives.
Source: Financial Times
