One important thing about a website, any website, is that you should be able to tell where users click and how often. The concept is the same as the electronic price scanners in the supermarket.
If shoppers suddenly start buying a certain brand of coffee, the manager needs to order more to meet the demand. Price scanners let him do that quickly, easily, and efficiently; often he doesn't even need to do anything more than print out a report from the computer.
The same thing goes for websites. A webmaster needs to know what people are looking at so he can make his site better. If visitors are clicking on certain areas of the site, the webmaster will probably want to add more of the content they're interested in. Furthermore, when Google's web crawlers come to your website, you want them to gather the right information so that your pages rank higher. The better your ranking on the search engines, the more people come to your site.
So why spend a lot of time building a website, and pay for server space and bandwidth, only to set it up in a way that doesn't tell you whether your money is well spent?
I'm not talking about "Big Brother" kind of data gathering. I'm talking about aggregate data. For example, you might collect statistics on how many people click on an icon of a digital camera versus the number of clicks on a vintage Rolleiflex. If 50 people click on the digital camera but 250 people click on the Rolleiflex, you would probably like to know that. Wouldn't you? And, knowing that, wouldn't you be thinking about redesigning your website to favor the things that your visitors want to see?
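To make the idea concrete, here's a minimal Python sketch of turning a raw click log into the kind of aggregate counts described above. The event names ("digital_camera", "rolleiflex") and the shape of the log are invented for illustration; a real site would pull these from its server logs or an analytics service.

```python
# Hypothetical sketch: aggregating raw click events into per-item counts.
from collections import Counter

def summarize_clicks(events):
    """Count how many times each page element was clicked."""
    return Counter(events)

# A made-up click log, e.g. parsed from a web server's access log.
click_log = ["digital_camera"] * 50 + ["rolleiflex"] * 250

counts = summarize_clicks(click_log)
for item, n in counts.most_common():
    print(f"{item}: {n} clicks")
```

Run against the numbers from the example, this prints the Rolleiflex first with 250 clicks, which is exactly the signal you'd want before deciding what to feature on the front page.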
With Flash, that kind of tracking is much harder. That, among all the other reasons, is why Flash is bad...
Thanks for the link. Can't figure out how I missed it, as I am frequently on the REI site.
Originally Posted by kswatapug: As a tool for planning, I've long been fond of noaa.gov's website, especially the tabular data that includes projected hourly cloud cover, precipitation, temperature, etc. http://weatherspark.com/
Not to "rain" on your parade, but I would like to offer a (non-Flash) perspective on this site. Back in the eighties, prior to the World Wide Web, I offered value-added, computer-based weather forecasting by modem and fax before cashing in the chips, so I have some experience in value-added meteorology. It is important that the public at large realize that all weather information in the U.S. is generated from NOAA and European government data. Extrapolation of specific variables beyond 48 hours rests on increasingly error-prone numerical model interpretation. Like you, I use the NWS tabular data to get an idea of general weather conditions and the forecast for specific remote locations. While I note that WeatherSpark is beta (and its home page shows it), I find its graphical projections into the future subtly misleading in the same way that AccuWeather's 10-day and longer forecasts are. The degree of specificity and "dressing up" of such data by third parties can inadvertently lead one to view such forecasts as more reliable and factual than they actually are.
While I will check back to see what other services the site offers, I caution users of this (and other) value added sites to view the information with some degree of caution.
I am a weather geek, so I will add it to my list of sites like the NWS, Storm Pulse, SPC, Unisys (for upper-air charts and Skew-Ts), and Space Weather (for extraterrestrial weather).
"Fundamentally I think we need to rediscover a non-ironic world"
Wow great, works for the UK too!