One of the things I do in my capacity as “a person who uses the Internet a lot” (including for a paycheck) is read, trawl, and search the Web.
I see a trend that is not at all remarkable, and one that I believe I’ve commented on before.
Or not. Whatever.
The Web is growing up: I commented on this recently (March 15, regarding BlogLash – there, I found it… one instance of…).
One result of this maturation is a corresponding maturation in the ways the code that is parsed by servers and sent to your browser gets generated.
- Early days: Notepad, vi (!)
- 1996ish: First editors specifically for HTML appear; one still needs to know how to code tables and so on to survive.
- 1997-1999: Major strides happen in dynamic Web sites. The Evil that is MS FrontPage and other (some better, some equally nasty) WYSIWYG “Web-development” tools (really just HTML tools at that point) appear. They bring HTML to the common man; a geek is still needed to tweak those nasty tables and other issues. At the same time, the first divide happens: backend developers (Perl and Java the main dynamic tools of note) and front-end developers (HTML jockeys, often with programming/SQL skills).
- 1999-2002ish: Programming becomes more and more specialized; backend tools (application servers, databases, datafeeds, search tools) take center stage; the presentation layer is the least complex matter. Sites generally fall into five camps: 1) plain HTML (static sites), 2) ASP, 3) JSP/Java servlets, 4) PHP, 5) Perl (the old standby…). ColdFusion is a presence, but does well mainly with smaller sites that need dynamic content. It’s perfect for that use, including intranets (so no one really gets to see them).
- 1999-2002ish Redux: Even as hand-coding HTML is devalued, CSS and DHTML (CSS + JavaScript) take off, putting a little more emphasis back on front-end development. The tools have not caught up with the technology (so warm bodies are still needed). Thank god the browsers are finally getting close to similar… now if only all the Netscape 4.x users would log off forever…
- Today(ish): While Web Services has been a buzz phrase for some time, some work is actually getting done in this area. Not as much as people expected by this date, but I don’t think anyone can deny that Web Services will be huge in the future. How near this “Web Services future” is and what form the Web Services will take are way up in the air, but the over-arching concept is sound – a CORBA-type system that allows dissimilar systems to talk to each other and pull data/content from each other without requiring a specialized parsing/access system for each (see the SOAP sketch after this list). Think of it as a phone that translates what you say into the language the person on the other end of the line speaks, and converts their responses into your language. Neat.
- Today (redux): Again, there is a schism in what is needed. The backend predominates (databases, SOAP, XML, etc.), but the scripting languages that actually send the material to the user (or that are first parsed by the server/application server) don’t yet have the robust tools needed. Or rather, the backend tools don’t have robust ways to hand that material off to the front end. Yet this is sorta ignored by most back-end developers, which is why I run across so many JavaScript errors on my Web travels (the second sketch below shows the classic way that happens).
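To make that “translator phone” idea a bit more concrete, here is roughly what one of these Web Services conversations looks like on the wire. This is only a sketch – the service, method, and field names (StockQuoteService, GetQuote, ticker) are made up for illustration – but the shape is the point: plain XML wrapped in a SOAP envelope that any system, on any platform, can parse.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical request: "give me a quote for ticker MSFT" -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/StockQuoteService">
      <ticker>MSFT</ticker>
    </GetQuote>
  </soap:Body>
</soap:Envelope>

<!-- ...and the hypothetical response that comes back -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuoteResponse xmlns="http://example.com/StockQuoteService">
      <price>54.25</price>
    </GetQuoteResponse>
  </soap:Body>
</soap:Envelope>
```

The caller doesn’t care whether the other end is Java, Perl, or something wheezing away on a mainframe; both sides only have to speak XML over HTTP.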
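And here is the kind of thing I mean about back-end developers breaking the front end. A hypothetical PHP snippet (any server-side language would do the same damage) that dumps a database value straight into a script block:

```php
<?php
// A headline pulled from the database -- note the apostrophe.
$headline = "O'Reilly announces new conference";

// The careless version: the apostrophe ends the JavaScript string early,
// so every visitor gets a JavaScript error instead of a headline.
echo "<script>var headline = '$headline';</script>";

// The version that actually works: escape the value before handing it
// to the front end (addslashes() is the blunt instrument here).
echo "<script>var headline = '" . addslashes($headline) . "';</script>";
?>
```

Trivial example, but multiply it across every dynamic page on a site and you get my Web travels.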
But I babble (so?).
The upshot of all this is that the emphasis, for better or for worse, is once again squarely on the backend. The hell with the presentation layer (except for the look – NOT the feel [usability]); that’s gravy; that can be fixed later.
To a large extent, this is not a bad way to look at things.
And – as the backend tools get better – they will help the front end, too. I expect MS to be a big part of this, even though I hate its FrontPage, as mentioned above. (It generates bad code! Bad! Bloated!)
OK, those are the (my) facts.
What are the results of this? More to follow…