Computing

Sunday in the park

I spent a portion of Sunday afternoon, between working on some PHP projects that need to be finished, watching the ducks and Canada geese on the pond at the lake. For some time I pondered Dave Winer's comments on Saturday about Atom and wondered why not produce a feed normalizer that converts Atom to RSS. Obviously I wasn't the only one. The thing is, Dave is right. We already have a good standard (thanks, Netscape and UserLand). If it's too ambiguous in places for some, then let's extend the current spec in a way that doesn't break it.
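To make the normalizer idea concrete, here's a minimal sketch of the sort of mapping involved: the core fields of an Atom entry copied onto an RSS 2.0 item. The type names and the assumption that the feed has already been parsed into plain objects are mine; a real normalizer would parse the XML and deal with the many optional elements on both sides.

```typescript
// Illustrative shapes only -- assumes the Atom XML has already been parsed.
interface AtomEntry {
  title: string;
  link: string;     // from <link href="...">
  id: string;       // from <id>
  updated: string;  // ISO 8601 date from <updated>
  summary?: string; // from <summary> or <content>
}

interface RssItem {
  title: string;
  link: string;
  guid: string;
  pubDate: string;  // RFC 822-style date expected by RSS 2.0
  description?: string;
}

// Map one Atom entry onto one RSS item.
function atomEntryToRssItem(entry: AtomEntry): RssItem {
  return {
    title: entry.title,
    link: entry.link,
    guid: entry.id,
    pubDate: new Date(entry.updated).toUTCString(),
    description: entry.summary,
  };
}
```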

What to tell Microsoft

Dave Winer asks for suggestions of features he should bring up with the Microsoft IE team that would make IE better for blogging. My request is really one for Dave, and something I'm pretty sure he would do anyway: please publish the list of suggestions. There are lots of other great browsers out there, and I'm sure there are folks who will pick up on the tips and help make all kinds of blogging interfaces (and not just web browsers) better.

You don't exist, go away

IBM developerWorks has a good article about error checking and how programmers and designers need to be careful to ensure that their checks cover not only probable answers or situations but also possible ones. Imagine a web form asking for the name of the President of the US when you were born. The form asks for the middle name, but denies the input when you enter "S", which really is President Truman's middle name.
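Here's a contrived sketch of that trap: a "sanity check" that rejects a perfectly valid answer. The length rule in the first function is my own assumption about how such a form might be written; it demands at least two letters, so "S" is refused even though it is a legitimate middle name.

```typescript
// Too strict: assumes a middle name must be at least two letters long.
function middleNameTooStrict(name: string): boolean {
  return /^[A-Za-z]{2,}$/.test(name); // rejects "S"
}

// Checks only that something sensible was typed, leaving room for the
// possible-but-improbable case.
function middleNameLenient(name: string): boolean {
  return /^[A-Za-z][A-Za-z .'-]*$/.test(name); // "S" is accepted
}

console.log(middleNameTooStrict("S")); // false -- valid input rejected
console.log(middleNameLenient("S"));   // true
```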

The author mentions that when companies ask for a phone number, he gives them their own customer comment line. I've been known to use the same technique for email addresses.

Search engines vs. directories

With the increasing power and reach of search engines, there is an ongoing debate about the relative merits of search engines and human-edited directories. The big search engines are locked in battles with web developers in search of the all-powerful first place in the results. Do a Google search on "horse AIM icons", for example, and the first several results returned are all advertising for various drugs.

So here's an idea. Why doesn't Google implement an API that would allow web users to indicate that a page in the results is deceptive? By requiring people to sign up for an API ID and limiting the number of sites that can be reported in a day, they could help reduce malicious use of the service. Perhaps there is ultimately a human who reviews the top offenders and maintains a blacklist of sorts to remove them from the top of the results.
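A hypothetical sketch of how the per-key daily limit might work is below. Nothing here is a real Google API; the key registry, the quota of ten reports, and the report queue are all made up to illustrate the idea.

```typescript
const DAILY_LIMIT = 10; // assumed quota per API key per day

interface ApiKeyRecord {
  reportsToday: number;
  windowStart: number; // start of the current 24-hour window, in ms
}

const keys = new Map<string, ApiKeyRecord>();
const reportedUrls: { url: string; reason: string }[] = [];

function reportDeceptivePage(apiKey: string, url: string, reason: string): boolean {
  const now = Date.now();
  let record = keys.get(apiKey);

  // Reset the counter once the 24-hour window has passed.
  if (!record || now - record.windowStart > 24 * 60 * 60 * 1000) {
    record = { reportsToday: 0, windowStart: now };
    keys.set(apiKey, record);
  }

  // Limiting reports per key per day makes bulk abuse harder.
  if (record.reportsToday >= DAILY_LIMIT) {
    return false;
  }

  record.reportsToday += 1;
  reportedUrls.push({ url, reason });
  // The most frequently reported pages would go to a human reviewer
  // for possible blacklisting.
  return true;
}
```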
