CSS and MIME Types

Well, I found out my HTML 4.01 Strict & CSS problem at littleghost.com is not me.

I finally got to the point of understanding the problem — the MIME type for CSS was not properly set on the server. This only creates issues when one uses the strict doctype and attempts to import/link a CSS file. Odd, until you realize the strict doctype puts browsers into standards mode, where a style sheet served as anything but text/css may be ignored outright.

And to make it worse, it’s only an issue (a display issue) for Netscape. Works fine in IE — but the style sheet does fail the W3C validator, so that’s no good. It should pass, and does: when the same CSS is parked somewhere else (a different domain) and called from a page on littleghost.com, all is well.
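
For anyone chasing the same ghost: you can check exactly what Content-Type header a server sends without trusting any browser. A quick check, assuming curl is on the box (the stylesheet path here is made up):

    # Ask for just the response headers; a stylesheet should come back as text/css
    curl -sI http://www.littleghost.com/styles/main.css | grep -i '^content-type'
    # A misconfigured server typically reports text/plain or text/html instead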

I wrote to Concentric, and they assured me that the type was set. It appears they were just looking at the pages displaying in a browser, and I’m all but certain that the browser was IE (why wouldn’t it be?).

So I set up some example pages for them to look at, and they finally got it. It has something to do with the configuration on my domain. They tried a test HTML/CSS file set they had; it worked fine and validated at the W3C when hosted somewhere else. They moved it into my domain, and they began to see what I saw.
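
If Apache is what’s under the hood over there (an assumption on my part; I don’t know what Concentric actually runs), the fix is a one-liner in the server config or an .htaccess file:

    # Apache config or .htaccess: map the .css extension to the right MIME type
    AddType text/css .css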

They are escalating the issue.

While it’s good they finally got it, what if I was a newbie? Their first response was “yes it works; the MIME type is set”.

How would I have been able to tell them they were wrong? I would have spent weeks coding/tweaking to make it work…and never understood why it didn’t.

I’m actually pretty good at this stuff, but I had to prove to myself (so I could prove to them) that it was them, not me, before they’d try to address the issue. If I didn’t have this Linux box here (so I could kill/add MIME types) I might have been screwed, even knowing what I do.

I felt I needed to do this so I could tell them (as I did) that I have this working in three different environments, two NT and one Linux — but the same code fails at Concentric. And what if I didn’t have another domain to park the style sheet on, so I could show them that a call to Concentric for the CSS fails while a call — from the same code — to get the CSS from another domain succeeds?

That’s one of the reasons I like to have two domains, but it’s always nice when they are configured correctly. I’m just learning this administration stuff myself, so I use the actual domain hosts I have as examples of what should be done. So I can see if what I’ve done here will work in the “real world.”

Ah well, we’ll see if they get back to me on this one. I’ve been at Concentric (ok, “XO”) for five years now, and don’t really want to move the domain unless I have to. Just too much of a hassle.

About databases…

OK, I was thinking about databases.

What am I thinking about now?

I’ve been coding my brains out lately, but in a very helter-skelter way that (occasionally) dovetails nicely.

The following is a list of what I’ve been working on lately:

  • Littleghost.com: Revamping my littleghost.com Web site for the first time since launch. See earlier entry.
  • HTML 4.01/XHTML: Now that Netscrape is finally standards compliant, it’s time to really knuckle down and figure out how to use the tools the W3C has given us over the years that we just could not use effectively. This is a large part of the Littleghost.com redesign. (Note: A recent survey [by whom?] said that IE 5-6 have 95% of the market. Fortunately, Netscape’s remaining share is strongest with its new offerings, NOT v4.x.)
  • Perl & PHP: For various reasons and various tasks, I have been doing a lot of Perl and PHP. I like both a great deal. I have been working with both languages for about 2-3 years, but never really got a lot of time to use them. I’m making time now. I wish my providers supported PHP (one does, but I have to code the scripts with a shebang line like Perl scripts and put them in the CGI-BIN, which makes them fairly non-portable).
  • Web services: As mentioned in an earlier entry, this is something that I got into because Amazon and Google are opening their APIs to a degree, and use of XML tools makes both sites accessible. No linking; no frame-out. Import the raw data and knock yourself out…
  • Javascript/DHTML/CSS: As part of my “standards” search/pursuit, I’ve been doing a lot of this, and making sure it works in IE and Netscape. For the last year or so I’ve been designing for IE solely, and there are still some quirks required to make anything look the same in Netscape, even the new versions. So — OK — the true “standards-compliant” browsers are not here yet, but they are getting damn close. Thank god the damn LAYER tag is gone….

I’ve been doing a little XML, some Cold Fusion, some stored procs and messing with three different databases (mySQL and Postgres on Linux; MS SQL Server on Win2000), as well. Probably not as much as I should, but there is only so much time.

=======================

One other thing I have been getting into lately is shell scripting. I finally found a book (PDF, on the Web, free) on BASH scripting (I use Bash on my Linux box; to be honest I don’t know if I have the Korn or Bourne shell on there. Doesn’t look like it).

Shell scripts are a pain in the ass, but excellent coding practice. They are difficult because they are so precise. With HTML, you can get away with almost anything (no close TR? the browser understands). With Cold Fusion, you get away with a lot (not case sensitive, loosely typed etc). With Perl, it’ll slap you for case, but other matters are handled transparently (variable $num doesn’t exist? Then “$myNum = $num + 7” will equal 7. No error).

Shell scripts require all sorts of rule-following; the most difficult — to me — is the space issue:


I like writing (a Perl habit): $c = $b + 4;

Bash requires no spaces around the equals sign, no dollar sign on the variable being assigned, and arithmetic expansion if you actually want math: c=$((b + 4))

Yeah, same thing, but …. just not my usual coding practice.

But that’s good — you HAVE to be precise with shell scripts, and the discipline is a good thing. (However painful.)
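
To make the spacing rules concrete, here’s a tiny runnable sketch (the variable names are just for illustration):

    #!/bin/bash
    b=3
    c=$((b + 4))      # assignment: no spaces around =, no $ on the left side
    echo "$c"         # prints 7
    # c = $((b + 4))  # error: bash would parse "c" as a command name
    # c=$b+4          # no error, but assigns the literal string "3+4"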

In a case of my “learnings” dovetailing, what I’m doing with the shell scripts is writing scripts to back up important files/directories on my Linux box to the Win2000 box and vice versa (a stripped-down sketch follows the list). This required the following tools/skills:

  • Shell scripts to do all the work, which includes FTP get/puts and so on
  • Installation/administration of an FTP server on my Win2000 box (freeware)
  • Installing the command line tool for WinZip, so I could write batch files to zip up selected directories
  • Scheduling — on the Win2000 box — the Zip batch files
  • Scheduling — via CRON — the jobs on the Linux box (all jobs run off the Linux box except the Win2000 directory zip, which are batch jobs. Linux is much better for scheduling and scripting [have tar, gzip, permission handlers etc all at your fingertips] )
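
Here’s roughly the shape of the Linux-side nightly job. A minimal sketch: the hostname, login and paths are all invented for illustration:

    #!/bin/bash
    # Nightly backup: tar/gzip a directory, push the archive to the
    # Win2000 box's FTP server. Names below are placeholders.
    SRC=/home/geek/important
    NAME=backup-$(date +%Y%m%d).tar.gz

    cd /tmp || exit 1
    tar czf "$NAME" "$SRC" || exit 1

    # Drive the ftp client non-interactively (-n suppresses auto-login)
    printf 'user backupuser secretpass\nbinary\nput %s\nbye\n' "$NAME" | ftp -n win2000box

    rm -f "$NAME"
    # Matching crontab entry (2 a.m. nightly):
    # 0 2 * * * /home/geek/bin/backup.sh >> /var/log/backup.log 2>&1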

It’s been an eye-opener.

I currently have eight CRON jobs running every night; before the crons run, I have two scheduled batch jobs on the Win2000 box that zip things up.

Pretty cool.

And the best part is that I wrote these a month or so ago, and I just let them go. And they keep working. (Yes, I do check that they ran, and occasionally try to “restore” from a backup: never failed yet).

This was a lot of work — mainly because a lot wasn’t in place (FTP server etc.) — but because I do have at least passing familiarity with crontab, the scheduler and so on, it was pretty straight-forward. Lot of work; lot of time — but no “deal breaker” dead ends. Just busy work, to some extent. I would figure I’d need this or that; I’d do it. No biggie.

Sometimes being an inquisitive geek pays off.

mySQL

I wrote — over a year ago — that I was “thinking about databases” and all that.

That train of thought turned into a guest editorial on the subject of open source databases vs. commercial products.

It was interesting to write — made me think — and, of course, the response from readers was the really interesting part.

Sure, I got flamed, called an idiot and all that, but there was a lot of knowledge and experience behind the responses in many cases.

Basically, the article said “Open source solutions are in many areas comparable or better than commercial products, but this is not true in the case of OSS databases. Why no outcry (or am I missing the outcry)?”

And — basically — the response from readers was that what is out there is fine; the options offered by commercial products were just not needed or could not be cost-justified.

Wow. Blew me away.

Because the most widely accepted/deployed OSS database — mySQL — is really a piece of crap. It isn’t ACID-compliant, it’s filled with proprietary (instead of ANSI-compliant) SQL (such as the CONCAT function! Scary…), and it does not support a lot of the things that make a database a database.
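
One concrete example of the proprietary flavor, run through the mysql command-line client (database and column names invented for illustration):

    # MySQL spells string concatenation with its own CONCAT() function:
    mysql -e "SELECT CONCAT(first_name, ' ', last_name) FROM users" mydb
    # ANSI SQL spells it with the || operator, which MySQL by default
    # treats as a logical OR instead:
    #   SELECT first_name || ' ' || last_name FROM users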

While there were dissents — and those who said that, yeah, Oracle is good, but I’ll never use the 10 million configuration options offered (fair) — the general response was that mySQL is just fine for the job.

Basically, people are using mySQL — and other OSS databases, such as Postgres, SAP etc. — much like flat files. Just a big table or two; maybe joining the two in some cases. Very denormalized. The advantage to using a database instead of a flat file even in this case, of course, is that one doesn’t have to write the logic to extract/order/limit the data pulled from the “data store” — SQL is used.

And then you can extend it later — add another table etc — very easily. And — importantly — without changing/adding any business logic.

And that is a good thing.

But it was just a bit scary to me: I had thought that the OSS crowd, in general, was more sophisticated about databases than that. I got notes and there were posts from people who had been doing this for years and saying, basically, that they don’t use primary keys and so on.

Again, the “relational flat file” syndrome.

While I agree that many projects do not need the weight of an Oracle or MS SQL installation, one should still adhere to good database design and usage no matter the product used. It just seems odd — and surprising — to me that the users of OSS software don’t seem to put a lot of stock in these “best practices.”

On the other hand, I’m judging from the people who posted. And those who posted — or wrote — are probably always going to be those who disagree, not the ones nodding their heads and thinking “yep….”.

All in all, an interesting project (the article/responses).

Where’d I go?

Damn.

Can’t believe it’s been a year since I posted here — actually, almost 15 months.

I do remember reading something somewhere recently (/.?) that mentioned an article that correlates the rise in blogging with the rise of unemployment among the blogger types — techies.

Makes sense, and sort of works here.

But whatever. Onward.

========================

I’ve finally gotten around to redesigning/recoding the littleghost.com site.

When I got the domain back in July 1997, I spent a weekend putting together a look and feel and all that….and I pretty much have not changed it since.

Sure, I added sections here and there over the last five years, but I never really touched the GUI. Added a touch of a style sheet and so on, but nothing remarkable.

So I have begun the process of recoding the site. I’m trying to accomplish the following:

  • Slowly bring the look and feel of the separate sections together
  • The look and feel will be HTML 4.01 compliant and pass the W3C tests for HTML and CSS. Style-sheet driven site
  • The coding should be XHTML compliant, as well. This will take a bit more work, replacing tables and BR tags and so on
  • Make it look virtually the same in IE6 and NS7 — those are the only browsers I’m really worried about. (Note: The site will not render well in NS4.x, because of that browser’s poor CSS support.)

As always, this site is really for experimentation and so on — it’s not supposed to be a real site that people really want to visit. For all the servers and so on I have locally, running things remotely is different.

For example, there is some bug at Concentric that does not allow the inclusion of (or, at least, acknowledgement of) a style sheet if the doctype is HTML 4.01 Strict. Replace it with HTML 4.01 Transitional, and all is fine. Weird. I have to figure out just what is happening there.

So, currently, I have the style sheet called from geistlinger.com, and it’s fine. Go figure. Works fine locally on NT (Win2000 pro) and Linux (Apache). So I dunno. More things to check into! Oh boy….

Conversion is going well so far; I’m glad I waited until I had a little more experience in HTML 4.01 coding before converting — it’s not really as straight-forward as you might think, especially when you approach it (like I do) with an HTML 3 & HTML 3+ mindset. It’s still hard to think in DIVs and not TABLEs, to work out how to align things, to mess with the inheritance issues of CSS styles and so on.

It’s been a nice learning experience.

So far, I’ve converted over the main page, the postcard section (for the most part — large CGI rewrite necessary, as well) and the Term Glossary (need to import a new version of this from my Linux box).

I have not decided whether or not to change this area — Blog This! — to the new format. Would be a good exercise, but the first issue is functionality, and I don’t want to mess this up just for uniformity in looks. The looks will get there; I have to make certain the functionality is not affected.

==============

Other than that, I’ve been doing a lot of coding, from Perl through PHP to Cold Fusion. Database work has been relatively light recently, just a stored proc here and there, some tweaks as new sections need it and so on.

One thing I did spend several days on is using Google’s open API as a Web service. This rocks.

Basically, I can make calls to Google’s database and pull back the results to my (Linux) machine and massage the results as I see fit. It’s done via a SOAP wrapper and a local WSDL file (provided by Google).

We’re talking a Web service. And it works. How cool is that?

I’d love to publish it out here on littleghost.com, but the necessary SOAP module (I wrote the program in Perl) is not installed on either of my domains — so I can only run it on my Linux box. Still cool….

Amazon has a similar program going; I have to try to see if I can get that to work. Maybe this time I’ll do it in PHP (need the PHP SOAP wrapper for my Linux box, however…).

Lots to learn out there, and the industry leaders in services are turning out to be companies like Google and Amazon, and not the players like IBM, Sun, M$ and so on. Interesting. While the “real” players (IBM etc…) will catch up quickly, I think it’s interesting that the pure players — the “all Web” players (Google etc…) — are really making a difference, and making the promise of Web services (which is wildly overhyped currently) a reality for the average Joe Developer to see.

You go guys….

Digging into databases

I’ve been thinking a lot about databases recently.

One of the reasons for this is my increasing work on databases/dynamic sites, and my relatively recent exposure to databases.

Sure, I’ve done Access and worked with sites that have run big databases, but until the last couple of years I never really made an ODBC connection myself.

Now I do it with alacrity and increasing frequency.

So I think more about it.

In a way, I have a good point of view: I have a solid background in Web design and, to a degree, application architecture. And while my database exposure has been relatively recent, I’m not a stodgy old man on this front: No, I’m not an expert in databases. But I sure wish I was. They change everything for a Web developer.

With all the tools I can bring to the table, I’m able to see the flaws and strengths of most databases — or at least the database/application nexus.

What I’m seeing, overall, is the following:

  • Badly designed databases (I’m as guilty of this as the next guy, but my mistakes stay on my home computer; they don’t run actual businesses).
  • Bad databases for open source (mySQL & PostgreSQL, though both have their good points, as well: mySQL — installed base; PostgreSQL — great database overall but for lack of tools).
  • Stupid, awkward database connections by programming languages (Perl, PHP, ASP for example). I’m spoiled: Even if you hate Cold Fusion — and many programmers do — you cannot deny that it makes connecting to a database the easiest. Set up a DSN in an admin tool, and then all you have to do to run a query and get results is this:


    <cfquery name="getStuff" datasource="myDSN">select * from table</cfquery>


    How hard was that?

  • Uncomfortable database/application solutions. To use a Cold Fusion example again: To me, the best solution here is to run Cold Fusion on Linux (faster than on NT; way cheaper; more reliable) and MS SQL Server on NT/2000. Virtually no one does this. The platform wars are still hurting us, although things are getting better.

What tech will stand the test of time?

OK, we can do this the long way or the short way.

Right now, I feel like the short way. I’ve discussed much of what I will give a “cash” or “trash” rating to before; now is not the time to go into detail for this. This is spozed to be a quick overview.

The question is, what technologies will survive in a meaningful way (yes, XXX database is the darling of 5 billion sites. With total incomes of US$5.00. Great database, “trash” on this list). The question comes down to a business decision: can it help us fiscally? Is it something that one should spend time learning (Java, yes. C#? Unclear. FORTRAN? Probably not…)? CASH. If not, TRASH, however cool. One has to establish a fulcrum (however bogus) and work off this basis. Sorry.

  • Open source software in general — TRASH: With the exception of Linux, it will be marginalized even more than it is today. Big players — i.e. Microsoft, IBM, Oracle and so on — will come up with solutions that match or exceed OSS, and they have sales forces and marketing money….

    Application environments

  • PHP — TRASH: It will endure and grow, but unless something fundamentally changes with the Web, it will never be the powerhouse ASP or JSP are. I actually have trouble with this one, because I both like PHP, and I do see some sites moving from JSP to PHP. Interesting; should be the other way around…or should it? That’s why I have trouble. PHP has overcome the initial “cool but so what?” worries and is now a solid language (v4 was a big leap), but it still suffers from one huge liability: Database issues. 1) Like ASP, I hate the mechanism to hook to databases in PHP (yeah, Perl is even worse); 2) One goes with PHP because one is running *NIX. What databases? MySQL is the No. 1 open source database, but really not a mature database. PostgreSQL is much, much better, but no tools, no installed base and bad business decisions (Great Bridge be gone). That leaves Oracle, which rocks, but costs a bundle. Which almost defeats the point of using free software for developing. My guess is that there will be a lot of small sites — much like Cold Fusion sites — by small companies or as intranets using PHP/mySQL. Nothing to speak of at the enterprise level, however. Where I might be missing the boat is in seeing that PHP is slowly becoming the tool of choice for Perl coders who need a more “HTML friendly” language. PHP combines HTML hooks and uses a lot of Perl (and Java) functions/regular expressions and so on, so this could well be true. I’m just making this up now, but a solid case could probably be made for this.
  • Cold Fusion — CASH: Not to a large degree, but Web sites are getting more and more dynamic — a year or so ago a database-driven site was the exception, now it’s the norm — and that is what CF excels at. I expected better things from v5.x; right now it looks more like a minor point update, not a full integer upgrade. Performance boosts and stuff that PHP had before. It will be marginalized vs. ASP and JSP, but will still power many sites IF it keeps growing (the ease of database access and coding is why it rocks; if this does not continue to improve as other technologies do, it’s doomed). Serious geeks will always consider CF a tinker-toy language, which is true — and that’s its selling point. You don’t need a degree or a team of technologists to create a dynamic intranet site. One drawback to CF is the same issue as with PHP: Database issues. Yes, CF runs against all databases, but most deploy it against MS SQL Server. Which is great, actually. But, as mentioned above, open source databases largely suck. So users have the choice of CF on NT with SQL Server, or CF on Linux (say) against a MS SQL Server database on NT. The latter has better performance, higher reliability, lower cost. And the former is almost ALWAYS selected, because it’s easier. One OS is good; IIS comes bundled with NT… Sigh.
  • ASP — CASH (big time): Face it — it’s a MS product, it is blazingly fast because it is all native (NT/IIS), nice COM and DCOM hooks. There’s a lot to say for it. I don’t like it, but then I’m not a Microserf. It powers a lot of sites, and will continue to do so. It’s partly a perception problem — IT managers will have a tough time selling, say, PHP or Cold Fusion to management — management has no clue (“What’s a pee-ache-pee?”). Can’t go wrong saying “all Microsoft site”: won’t have to worry about compatibility, one sys admin can take care of it all and … yes, Microsoft will be in business in 10 years. Will Allaire — oops! — Macromedia? Will PHP even exist in 10 years?
  • JSP/JHTML — CASH: This is a tough one, because — as I’ve mentioned — I’ve seen some indications of people moving from JSP to PHP, which is contrary to what I’d expect. That said, I still think it will grow, because data-driven sites are mandatory now. JSP (or JHTML/servlets) is UNIX’s answer to ASP. Those with big Sun boxes don’t have too many choices — PHP, Cold Fusion, Perl/Python or some Java solution. The first two are really not scalable as currently offered (I doubt Cold Fusion will ever be really Enterprise ready — reads, yes. Writes, no). Right now there seems to be a split between a home-brewed Perl/Python package (Mason and mod_perl have helped this) and Java solutions for the big companies on UNIX. I see Java rising to the top shortly, if for no other reason than the rapid climb of middleware such as WebSphere and WebLogic — Java tools/solutions for UNIX. This part of the market will get even bigger in the future, especially the higher up one goes on the Fortune XXXX list. (Ironic note: Microsoft is always dinged by detractors for closed, proprietary products. While true to a degree, the MS platform — simply because of its sheer ubiquity and relative [to Sun/Oracle, say] low cost — offers more development options. MS will run Oracle, mySQL and MS databases. It will run Cold Fusion, PHP, JSP, app servers and so on. While performance might not be as good — or security as high — there are options.)
  • PERL — CASH: Larry Wall once said Perl was “the duct tape that holds the Internet together,” or something like that. This becomes less and less true every day, but it still is extremely valid. Perl is everywhere, even if not as a development language (I personally think PHP will be the new Perl for Web dev). It’s doing data transformations, image processing, imports, exports and so on. It will never go away. It’s that good. It can never hurt to know Perl, even in an NT environment. Yes, gone are the days when a dynamic Web site was a Perl CGI opening a flat file and returning results, but Perl still has a very big place in Web development, even if it is not front-end development. Please don’t build your site in Perl; please learn Perl.

    Databases

  • MS SQL Server — CASH, bigtime: If you buy into the MS message, you will be running SQL Server. There will be some folks who will — for some personal reasons — run Oracle on NT, and there will be those who — for cost savings — run mySQL (or even Access!!!!) on NT as well, but for any serious Web development, you run SQL Server if you run MS stuff. And this is personal, but I think SQL Server is the single best MS product. Really. And v7 made enormous strides over v6.5; v2000 is supposed to be better still, even if it’s not the quantum leap the v6.5-to-v7 jump was. If MS ever ported SQL Server to Linux — which I seriously doubt — it would instantly become an enormous hit. I still think there will be a lot of sites running either PHP or Cold Fusion on Linux against a SQL Server database. This is a great combo — the best of both worlds. It won’t happen as much as it “should,” simply because most places are not forward-thinking enough to run in two environments. And to be fair, it is more daunting to do so.
  • Oracle — CASH, bigtime: Oracle is the No. 1 database for operations where money is no object. That’s about how much they charge, but they have gotten better with the “i” products. While most people don’t need the 2 gazillion tuning options Oracle offers, some do. While MS SQL Server 2000 is making some serious inroads into the true enterprise market (think Amazon.com, Dell.com, Ebay.com), Oracle is still the standard bearer. Also, if you were running on UNIX — as most of the enterprise market was prior to a year or two ago — Oracle was basically the only real choice, unless you were an IBM house.
  • DB2 — CASH: I really don’t know enough about this to say much, but it’s an IBM product. And IBM has really gotten its Internet act together over the past couple of years. DB2 is their database of choice — because it’s their database — but they do support other databases in their WebSphere development product. My guess is that most use Oracle, unless they are a “totally IBM” shop. But it’s IBM. They will survive. DB2 will survive at least in the short run just for this reason, even if it is a bad database — which I don’t know one way or another.
  • Sybase/Informix — CASH: Informix is tough to call, as it has been purchased by IBM. What does that mean? Unclear. Replace/augment DB2? Wither away and get those users on DB2? I dunno. My guess is that both will maintain a presence, mainly at enterprise-level companies, but they will not be the real movers and shakers. Like COBOL, there will always be a need for them. Yet each year the market will shrink, and few — if any — brand-new installations (a new installation at a company without the DB already running) will appear. I really don’t know much about these guys. I could be way off base on this one.
  • mySQL — TRASH: This is not a good database, though recent attempts have made it better. It does have the installed-base crown for OSS databases, but … so? I have been reading more and more articles about how to hook PHP (the basic dev tool for mySQL, along with Perl I guess) to MS SQL Server. Which would have been unthinkable a year or two ago. OSS with MS?! Are you mad?! Times change; more sophisticated sites need the support only a database like MS SQL Server can give them. These articles are on OSS sites, too. And the articles’ message threads have more “Help! I’m having trouble doing this!” than “MS sucks!”. Draw your own conclusions. mySQL will probably never go away; it will become the Access database of the OSS world. But it’s not going to reach any higher than Access has already stretched, which ain’t saying much.
  • mSQL — TRASH: OSS database decisions often come down to mSQL and mySQL. The latter always wins (people don’t know about PostgreSQL). mSQL? Stick a fork in it.
  • PostgreSQL — TRASH: This one is tough for only one reason: Because it is the database solution now offered by Red Hat to help the company give businesses an alternative to Oracle. But Great Bridge — the company that employed many of the Postgres founders and coders and tried to sell a commercial version of the product — is gone. The pinheads refused to partner with Red Hat. So Red Hat said, “OK!”. And suddenly Great Bridge is … unnecessary. Ouch. Back to the point — because of Red Hat, Postgres might endure, but I don’t know. It’s a great, stable database, but it came too late. The world was already carved up between “I want an open source solution” (mySQL wins) and “I need a REAL database” (and people go with either MS SQL Server or Oracle, depending on need). There is no need for a REAL database that is also OSS, unfortunately. I like Postgres; I run it. It will go away slowly….
  • Access — CASH: No, this database does not really belong here, but let’s get real: Most companies run on a Windows network and have Windows desktops. All can run Access. Access is easy to hook up to Cold Fusion (a good intranet dev environment), and can also — with more work — be hooked up to PHP. Access is essentially free (comes with all business machines, basically) and is a very easy database to use. So it will stick around. NOTE: There were a number of sites out there — such as Chrome Data — that actually ran a complex Web site off Access. Those days are pretty much over — users are moving to SQL Server — but it still can be done, and will be to a certain extent. Take my site, for example. I get three hits a year. Access database support is $10/mo; SQL Server is $20. Gee, which will I pick????

    Web servers

  • Apache — CASH: This is another odd OSS “cash.” While Apache does power the majority of the Web sites out there, the percentage is shrinking (IIS is replacing it, for the most part), and the sites that it does power are small sites — small business sites, hosting companies that run only Linux and will be out of business in five years (not because of Linux, but because of mind-set) and such. Virtually all large, highly transactional sites run Netscape. Those that don’t run IIS (such as dell.com). Apache is true OSS and no one really makes money off it except O’Reilly publishers. But it’s a great server — complex (but not for UNIX heads), flexible, fast and secure. One concern is why we still don’t have an Apache v2.x. Still in v1.x. Yes, it’s open source, and no one gets paid for this (some company is trying; I can’t recall its name — may be outta bizness). Apache is also available for NT, so lots of sysadmins — UNIX dorks familiar with Apache — will use that instead of the notoriously insecure IIS. For the most part, this will be small businesses and intranets, but there will always be a market for Apache on NT, as well. It just won’t pay.
  • Netscape (iPlanet) — CASH: Still the standard-bearer for enterprise-level servers with high transactions; even the Sun/Netscape iPlanet fiasco can’t dethrone it. Netscape rules in the UNIX world and has some presence — but not as much — in the NT world. One thing many users don’t realize is that the iPlanet FastTrack server is free and a good approximation of the very high priced Enterprise edition. If you run Enterprise in the workplace, run FastTrack on your laptop or whatever. Great server; easy to administer. Better than Apache in this respect, simply because I’m a fan of running the same software all over so things translate well. Learn how to set the primary document directory on FastTrack, and you’ll know how to do it on the Enterprise product.
  • IIS — CASH: Wow. Has IIS gotten beaten up lately. The Gartner Group even suggests replacing it. People won’t. It comes with NT, integrates well, has the same interface as other NT tools/products. Until they get hit, people will not abandon it. Also — if you’re running ASP — you’re fucked. That is pretty much your only choice. (That’s how MS gets you…). If you don’t run ASP, I don’t see any reason to run IIS except that it comes with NT. So what? Apache is free and faster for non-ASP things. But people won’t do that, I know…that’s why it’s a CASH.
  • All other servers — TRASH: Yes, some have their place and all that, but these three run 97% of the Web. Any questions?

These are currently the three big pieces of Web development now; it will be interesting to see how wrong I am in the short and long run.

I deliberately ignored (with some references) middle-tier products, as well as XML and the transformation languages around it. I wanted to focus on the basics as they are today.

Yesterday there was static code and Web server.

Today there is dynamic code, database (for that dynamic content) and the Web server.

Tomorrow?

Check back…..

9/11 – first, thin thoughts

This is the beginning of the first weekend following a week that will, like Pearl Harbor, live in infamy.

Yes, it is the Saturday following the Tuesday bombing — there is no other single word for it — of New York’s World Trade Towers, the Pentagon and a failed attempt to steer another hijacked plane into another Washington, D.C. site.

This is a tech blog, yet let’s face it: No one — on TV, on stage, on the Web — can write/talk/think about much except this. Half of Slashdot was the bombings for days — and that’s geek central. Says something.

I escaped the personal impact of the terrorist attacks — I did not lose anyone, or my own life, in these acts.

I did lose a job, however.

I was supposed to receive a job offer on about Tuesday from a large corporation. Due to the bombings and the uncertainty of what lies ahead, this corporation established a hiring freeze. So I was frozen out.

As I explained to the corporation, I don’t like it, but I certainly fully understand. Things have changed, and no one is quite certain of what way they have changed.

And I have this totally in perspective: Yes, I lost a job opportunity. Thousands lost their lives; many thousands more will be forever impacted by this day in ways that go WAY beyond my piddly job opportunity.

‘Nuff said.


One of the debates going on right now — actually, it all began shortly after the bombings — is that this is either:

  1. The first true Internet war (people say Kosovo, but the columnists scoff)
  2. The day the Internet failed

Realistically, it’s probably a little bit of both.

To a large degree, I think the Web did fail in so many ways for people, especially on Tuesday, as the bombings were occurring.

I was online early Tuesday morning, checking/answering e-mail, hitting this and that site (mainly working on a freelance project). I finally hit CNN just after the first plane hit. I remember wondering — as the page was taking forever to load — if anything was up, because CNN is usually snappy at this time.

I saw the picture, wrote an e-mail to Romy and told her I was going to watch TV.

Which I did, for 15 straight hours. I didn’t even shower until about 3pm. I just watched.

I would occasionally check CNN — and saw it go into the worst crisis mode I have ever seen: Logo and HTML text. That’s it.

Obviously, they were getting pounded.

I couldn’t even reach msnbc.com or abcnews.com. I’m sure AOL was nailed.

The Net failed a lot of people.

In a Wired.com article shortly afterwards, author Leander Kahney said that the Web didn’t fail, you just had to know where to look.

Uh, my Mom doesn’t know about Slashdot and the mirrors many folks scrambled to put up to help get the word out.

The Net failed — it did not operate in the fashion people expected. Most could not get information. Yes, the geeks could. My mom couldn’t.

It failed them.

On the other hand, there was a lot of great first-hand information out there — yes, if you knew where to look — and many people took the time and bandwidth to get the info out to people. It was great in that respect.

To me, the coolest thing that happened on the Web was the way everyone did band together to help. Some attempts were misguided to a degree — I have seen at least a half-dozen sites offering to be the clearinghouse for missing persons information (this should be consolidated in one place so people, again, don’t “have to know where to look”) — but for the most part the efforts were excellent. Amazon, in particular, should be commended. For this entire week, the top page of Amazon — that e-commerce juggernaut — has looked like the image on the right. Totally devoted to getting donations — which they will process at their expense — to the Red Cross. To date, $5.4 million has been raised through this one effort.

That rocks.

Let’s be honest: Amazon will reap a lot of positive publicity for it, but I don’t think that will make up the lost income from people who would have gone to the site and made an impulse buy. But even if they DO end up making money off this effort, what’s the down side? Money was sent to the Red Cross, people had a place — a highly visible place — to go and help out in some way. I don’t see the downside.

What a great concept. This is the Web at its best.

Who the hell is in charge of hiring here?

Right now the economy is not in turmoil, but it sure doesn’t feel good.

The tech sector is, as you well know, feeling pretty weak right now: Reports out today have indicated that, contrary to some recent hopeful signs, tech has not bottomed out yet.

The Internet sector is of course feeling the effects of this downturn/rightsizing/correction/recession/callitwhatyouwill much more acutely than other tech sectors. Things are ungood.

People working in tech — especially the Internet arena — are feeling the pinch. They are getting laid off, taking on more work (to fill the shoes of those laid off/left and not replaced) and have to sit in their cubes watching the perks dwindle (options worth….worthless, no more free Mountain Dew, no more “take your dog to work day”) and staring up at the Sword of Damocles hanging over their heads. “Am I next???”

Some will be next. Some will be spared but still fear the sword. Basically, for those let go or left behind, things suck for tech workers these days.

Yes, that’s relative. It sucks today compared to the heady days of about a year and a half ago, when options meant something (briefly), when your sandals and nose ring told more about your programming skills than your resume. Yes, things suck today compared to those good old days.

But, still, things are rough. While government and analyst reports say that IT is still a growing industry, tell that to the almost 19,000 workers cut by chipmaker Toshiba today, or those at Chapter 11 flooz.com.

Still tough.

And one of the lessons of this dot.com turned dot.bomb correction (or supply your own adjective) is that good ideas are not enough. There has to be a competent, forward-looking (I hate buzzwords/phrases but this one works for me) staff of individuals/teammates that makes it all happen. Sure, ordering a piano online may be cool. But don’t give the folks free delivery…..stuff like that.

Which makes this blog’s pet peeve all the more perplexing: The overwhelming incompetence and lack of professionalism displayed by recruiters and tech companies. Maybe this extends into other industries — I think it does, from what others have told me, but I don’t have first-hand knowledge of this.

What I do have first-hand knowledge of is summed up in the following three statements:

  1. As indicated above, IT personnel needs still outstrip supply.
  2. I’m an IT worker looking for a job.
  3. Companies/recruiters are unprofessional — they never follow through as they have promised.

I just got off the phone with a recruiter who called me, not the other way around. Spent about 45 minutes with him, giving him background and so on, before he got to the job he had called me for: Visual Basic programmer.

Yes, I know VB — it’s on the resume, flagged as a beginner skill. Why was he calling me? Guess: He did a search on Monster or the Web for Visual Basic, I turned up, and he called without doing ANY research — my resume is a (sadly, unemployment-wise) non-Microsoft resume. Once we got to this, I said I didn’t think I would work for what he had; he agreed.

We wasted both of our times.

Oh — and here is the favorite part: After the agreement that I wasn’t for this job, he said he’d e-mail me and I should send him my Word resume for future reference.

I will never hear from him.

I have a list of people that I have and have not heard from. It’s interesting.

People who promised to get back to me who did not:

  • Hall-Kinion recruiters (sp?) — Many interviews (phone), much work on my part to get a resume to them that they liked; I filled out their standard form promptly as requested. The ironic part of the last item is that their standard form had the following question (I’m quoting from memory): “What don’t you like about recruiters?” I answered: 1) Tech placement is hard; it’s complicated. Tough for recruiters to understand what I/someone else needs/wants/can do. That’s awkward but understandable. 2) Recruiters promise to get back and they don’t. The recruiter actually pointed this out to me afterwards, saying he “appreciated my candor” in this matter. Yet not enough to get back to me.
  • Vermillion Group (recruiter). Couple of calls, promise to get back.
  • peoplebonus.com — I had two phone interviews — one lengthy — with one of the founders, then went in to talk with this guy and the other founder (1/2 day killed for me); they then set me up to have a phone interview with the acting CTO (~ one hour) and this person arranged to have me speak on the phone with the leader of the consulting group they had engaged. About 2 hours there. Then nothing. Over one full day of time, plus gas, parking etc. NOTHING. MONTHS AGO. Any questions?
  • Lucas Group: This is the person — Jeremy — who called me for a VB job today. I have yet to hear from him; I doubt I will. (I’ll try to update if I do; if I DO hear from him and DON’T update, my bad). UPDATE TUES. AUG. 28 — Nope, never heard from him. I’m just in shock….
  • General Employment: I went in there months ago to meet with someone; they said I couldn’t unless I did this or that (uh, I have an appointment). I left. So this one is my mistake (NOT!).
  • truepoints.com — Several phone interviews (~3-4 hours total). Never heard from them. Then I read on themayreport.com a press release from them saying they had a beta site live; I was able to hack into it easily (uh, username = “username”, password = “password”). I wrote to The May Report about this (they published it); I wrote to truepoints.com about this with further info/comment. No response.
  • Many (more than three) that I don’t have enough data about on hand (one a company in Northbrook; one in Elk Grove Village) or I have just forgotten. Whatever.

People who promised to get back to me and who did:

  • Accuquote.com, Northbrook, IL. They sent a very prompt rejection letter (snail-mail). Kudos.

DO THE MATH.

Does this make sense?

OK, it may make sense, but is it professional? And I’m just looking for a RESPONSE WHEN ONE IS PROMISED. Yes, rejection letters are better in that they give closure (and then the persistent ones will stop buggin’ the recruiters etc) and they demonstrate professionalism. But I’m just looking for people to call when/if they have promised such. What’s wrong with that?

Nothing, of course. And that’s the weird part. Especially for the recruiters. Companies pay these folks to do the dirty work; maybe, say, Motorola doesn’t realize that this or that recruiter is placing people but pissing off a lot of others — who get pissed at the recruiter AND Motorola.

And the business of IT is business.

Shakeup coming in recruiting? Doubtful. I’ve worked at too many places where the Human Resources department (i.e. in-house recruiter) was a joke.

Doesn’t seem to worry anyone enough to make a difference.

See the sword….


OK, I didn’t want to go here, but now I’ve had two stupid phone calls in the last 20 minutes, so I’m forced.

As I mentioned before about the guy calling me (!) about the VB position, recruiters don’t get it. Yes, it is hard to place people in general — what do they really want/what does the company really want? — but in IT it’s very difficult due to the field’s technical nature. Sure, I can build database-driven Web sites. Oh, here’s a MS ASP job for you…uh, I don’t know ASP….etc…

It is hard.

But MANY RECRUITERS DON’T EVEN TRY:

  1. The guy who called me about VB (detailed above). Nothing I have suggests that I really know VB. A keyword search will find it, sure, and it will also find that I’m a beginner…if you read.
  2. Guy I just called back who had sent me an e-mail: The e-mail indicated he’d seen my resume. Please call. OK. The job was for SALES AND MARKETING. Nothing about that in the e-mail. Wasted both of our times (and my resume indicates NO sales or marketing experience/interest/skills). Why was I called?
  3. No clue in general. Too many to detail, but much of what is above. They home in on keywords; talk to them and they don’t know the difference between a static and a dynamic site, whether Access is a database (yeah, sorta) or what the difference is between an intranet and the Internet?? *sigh*

Real-time analysis!!!

I have two calls sorta outstanding right now. Let’s see if they come through (I’m a mouthing-off fool) or they don’t (told ya!):

  • Nelsy from Parallel Partners called me about a job I had applied to online. Few questions; I had more for her. Bottom line, she promised to call back before 5pm today (VERY quick if it happens, as I talked to her about 2pm). She sounded like she was calling from a call center. Interesting. UPDATE: While she did not get back to me by 5pm on this day, an e-mail did follow Tuesday Aug. 27 giving me the blow off. Unusual but good.
  • I got an e-mail. Called and they promised to call. No names. I don’t care. They won’t. Shit. This is bad for everyone. UPDATE: No, they never got back.

A decade of Linux

Today is the 10th birthday of Linux (the release of the kernel by an obscure Finnish guy named Linus T-something).

Wow. Just had the 20th birthday of the IBM PC — the “desktop breakthrough” — and now this.

And yesterday — Friday, Aug. 24 — marked the release of Microsoft’s next generation OS — Windows XP — to OEMs (release to public still scheduled for Oct. 25, I believe).

OK, where does that leave us? I guess with some OS issues to discuss.

What follows are ruminations mainly on Linux, but the release of WinXP and the anniversary of the IBM PC will also figure as nice counterpoints.

  • The story of Linux, of course, is really the story of open-source software at its best (collaboration, distribution, high utility with no cost). While I have slammed the open-source movement in the past — and will continue to do so in the future — for being a little too starry-eyed (have to make money somewhere dude….), Linux is the poster child for what open-source software (OSS) can do. Kudos to all.
  • Let’s get one thing straight: Using Linux is not a slap at MS. Using Windows is not a slap at OSS. Many people — especially in OSS — appear to feel that way, but this is inaccurate. Both are tools, and they may or may not have overlapping uses. MS is the “tyrant” right now because they are so big. But — Red Hat has caused some grumblings in the OSS community over the past year. Why? Basically, because they are the overly dominant Linux vendor. If SuSE or Mandrake had made the same moves Red Hat did, they would be congratulated, not slammed. Let’s keep this in perspective, folks.
  • Linux does compete with Windows on both the server and desktop.

    • Servers: Linux is a very real threat to MS on the server side, for the following reasons: Both run on relatively low-cost Intel boxes; Linux is more stable (it is; get over it…), cheaper (OK, spend $39.95 on a Red Hat disk) and has much better security (in this case, it’s not so much that Linux is good as it is that Windows/IIS is really weak). Linux is also faster for most applications. There are plenty of downsides to Linux on the server side, too: Linux is — I don’t care what geeks say — very difficult (relatively) to deploy and tweak. NT/IIS is much easier for novices — and there are a lot of novices out there. And there is still the lingering problem — although it is fast disappearing — of drivers and such for Linux. This is not as much an issue on the server — you just need Ethernet drivers etc — as it is on the desktop (printers, video cards, scanners….) — but still an issue. The one major drawback to Linux on the server, and I see this mainly at mid-sized companies, is that it creates a split environment: You need sys admins with both skills or two sets of sys admins. People costs are huge. Pay the MS license and not have to hire another worker. Much better for business. For larger companies, this is less of an issue, as they are used to having a desktop help/Web admin staff split (Windows/some UNIX flavor). Small companies rely on geeks to do both, who do it willingly. All things said and done, however, Linux will continue to expand on the server. This will hurt MS and UNIX companies, such as Sun. NOTE: One wildcard that I have trouble reconciling: The database server platform. Right now, MS SQL is the best product that MS has; it runs only on Windows. As I’ve mentioned, OSS databases are doomed to live on only at small companies that run the database and Web server on the same box (stoopid!). The only Linux options left are DB2 (only the IBM-aligned will do this) or Oracle (many large companies run Oracle, but it costs a fortune). My choice is still a Linux Web server (Apache or iPlanet) and a MS SQL database. Split platform, yes. But a nice option. If I had the money, an all-*nix deployment with an Oracle database.
    • Desktop: Again, get over it. MS will rule the desktop for at least a few years to come. As mentioned in the Server section, Linux is still too hard to configure (people have trouble with Windows; want to mess with the mess that is a Linux desktop?). And the tools just are not there, which is key. As I have mentioned before — probably several times — one has to use Word and Excel. These are business basics. Until Linux desktops have apps that can read and write these — and work in the same way — forget the desktop for business users, which is the overwhelming use of desktops. Home users? Remember, a lot of home users still buy Macs….. The driver problem is still here and more pervasive on the desktop for Linux, but it is getting much better quickly. Except for geeks, I cannot see any individual selecting a Linux desktop for home use unless he has the same at work (and the chances of that are almost nil). Just makes no sense. Little software (is there a Linux version of AOL?), different look and feel….no, not going to happen.

  • Let’s hope Linux does not fragment like UNIX did (Solaris, AIX, FreeBSD etc.). I don’t think it will, as there is now a cohesive presence of OSS leaders who will pressure to NOT have this happen, but…who knows?
  • Linux has changed considerably in focus over the last decade. This is normal, but people don’t seem to want to acknowledge this. Yes, they trumpet the inroads they have made — clustering, multiple processors, blah blah — but don’t seem to mention what has been left behind. And it’s always a trade off. When Linux came out, one of the cool things about it was you could put it on a 386. That would be a trick today. First of all, 386s came with, what, a 20-80MEG hard drive? The Red Hat v7.1 minimum install is much larger than that even for the server (no GUI) version. And the 2.4 kernel (RH v7.1) takes way more memory, which Red Hat at least acknowledges in its installer program. Most 386s had — tops — 64M RAM (I think mine had FOUR). RH v7.1 wants about 256. Ouch. But as I said, tradeoffs. Much better, more flexible, easier-to-install systems. And memory — HD and RAM — is way cheap today. But realistically, you’ll need a fairly recent Pentium to run Linux at home today. My Pentium Pro box is an old one; to run v7.1, I’d have to add another hard drive and an updated BIOS (again, too fucking complicated…). And it’s a Digital box; my guess is that I can’t even get a BIOS update for this fossil. RH v6.2 might be the last release that takes advantage of old equipment.
  • Linux is a good, solid OS, and that has helped it survive. However, a good product does not guarantee its survival: BeOS, NextStep, Mac OS (pre-10) and so on. Why did Linux survive where others didn’t/are marginalized? I think the masterstroke was two-fold: 1) The early Linux distros could actually run on relatively cheap and abundant Intel boxes. No, you didn’t need a $250,000 Sun box. Just an old 386. 2) This platform was also the platform for The Evil Empire — Microsoft. That made running Linux that much sweeter for geeks. Wipe the Windows OS off the box, install Linux and you have a fast, stable server or whatever. How cool is that to a geek? Very.
  • Linux also survived due to the gentle yet firm guidance of Linus and Alan Cox (esp. the former). They keep the issues low-key and almost folksy, even as Linux becomes a household word. There really has been no deviation from the original intent of Linux: A Unix for the masses, made by the masses, for free and kept open. This has, if anything, gotten better as the years have gone by. Yes, companies charge for Linux now, but that is for service/ease of installation/additional tools. Anyone can still download the Linux tarball and install it for free.

That’s a pretty impressive first decade: From a “hobby” toy to an OS that is the flagship for OSS and that runs thousands of important sites — google.com, realaudio.com etc. — not just my home box here……

What will the next ten years bring? Interesting question.

One teaser: Different architecture of computer/OS for databases? Think about it……

What’ll come out of the internet shake-up?

In my last blog, I took a look at what the future may hold for the IT/Web industry.

Yes, a lot of conjecture, a dash of prejudice and all that, but still — I named names, I predicted failure for that which I would like to succeed (PHP, Cold Fusion).

Hell, I predicted I would be an anachronism shortly.

Which leads to a more thorough examination of the human toll this dot.com/bomb/bust/blot will take from this day on.

Interesting issue — yes, I’ve examined the losers and winners in the corporate sense, whose technology will win (in my opinion), but what will this mean for IT/Web workers?

Yes, interesting.

Some thoughts, and this is all slanted toward an IT/Web point of view. I still think COBOL programmers will be needed in the near future; I’m just not even going in that direction here. OK?

In the true American/capitalistic fashion, I will outline what I see in terms of winners and losers. One can substitute, respectively, in demand and less/not in demand for these titles, but you get the drift…..

Winners:

  • Competent techies skilled with the “winner” tools I outlined in my last blog (ASP, Java — not PHP, mySQL and so on) — The more experience in one area these individuals have, the better. The Web world is getting more and more like a real business (good and bad) now, and broad skillsets are nice, but what businesses really want is someone who can rock their world in the one area they have advertised for: Java developer, ASP programmer, UNIX sys admin and so on. Anything else is gravy, but you must have the goods on the primary job description. The halcyon days of the early Web are gone. Get over it. Sorry, I scour Monster etc. every day, and what sticks out to me is the three things employers seem to be looking for in IT: DBAs (see below), MS product adherents (C++, ASP etc), Java. Period.
  • Database administrators (in general) — Look at it this way: The sites that are making money offer products. Listed products are not stored on static pages like journalistic content could be; they are databased. The future of the Web is traditional business (yes, Web-i-fied). While static sites with company info and personal sites will not go away, most traffic — by an overwhelming percentage — will be on dynamic sites. Need a DBA. ‘Nuff said.
  • XML experts — Note that I say experts. Not like some “Cold Fusion expert developers” I’ve dealt with. Like DBAs, XML geeks will be in increasing demand. It may take about 3-5 years to get to critical mass there, but it is coming.
  • Integration specialists — These are the consultants/FTEs like Don Drake who can leverage whatever tools (Perl and XML a good combo) to port legacy data/databases into a Web-enabled form. This is a very big deal. Similarly, the ability to port one type of “new” system to another “new” system. Real-time updates will become vital, and will require databases and processes to connect dissimilar sources. Difficult but the payoff is high. So it will happen.
  • MBAs — I wish I could say this was untrue, or — at least — say that only the MBAs that “get it” (say, Mary Butler, Bill Swislow) will endure — but I don’t think this is true. What is true? THE WEB IS ABOUT BUSINESS. Look at the “B” in MBA. Doesn’t matter if they get it, doesn’t matter if they care. They know business. Sad, true, necessary, unfortunate. All of the preceding. Still, the Web is about business. While those who do “get it” may do better — depending on the environment — it really doesn’t matter. If the clueless MBA knows a few buzzwords — CRM, impressions, Oracle, ASP blah blah — that will sell the clueless above. *sigh*
  • Security specialists — No, not there yet. Will be soon. Why? Because the WEB IS ABOUT BUSINESS. Lose a customer and lose money. That is not acceptable to a company (losing a customer…who cares? losing money, however…fix!). This will really ripen as more highly-publicized accounts hit the media. Sure, a company can — and will — lose a customer and dollars here and there because of security, but if it is publicized, that’s a nightmare that most companies don’t want to even think about. Costly. As more and more business moves onto the Web, spending $$ for security personnel will increase proportionally (actually, probably more quickly). I personally think the .NET initiative will fuel a lot of this security upgrade, especially when and if Hailstorm comes to fruition.

Losers:

  • Those with breadth of skills, not depth — Yes, this includes me. Damn. Fun to chat over a beer or two about experience with Gopher and Archie and Veronica…but does that have any business use? Nope.
  • Open-source isolationists — Yes, there are enormous benefits to using only open-source software. There is at least ONE compelling reason to not use open-source software: Every clueless CEO has heard of Microsoft and knows they have a support network; he has never heard of KDE, Mandrake, NuSphere and so on. The Web is business now (repeat after me, the Web is business now….the Web is business now…). Get it?
  • Innovators — There will be many exceptions to this rule, but for the most part those that innovate outside a large company have one of three futures:

    • Bankruptcy
    • Assimilation into a larger company (with a marketing department..see MS…)
    • Niche market marginallity (is that a word?)

    Sorry to be a pessimist, but that’s life. In the early days a Real Player was cool; now THE WEB IS BUSINESS. Standards.

  • Those that get it but don’t get business — These folks (I count myself among this group) will not be, in Nikita Khrushchev’s words, “crushed” — but I/we/they will be marginalized. Webmonkey.com exists today. No one reads it. Learn that lesson.

Please note that these winners and losers are generalizations. Yes, there will be moronic Java programmers that go hungry and psychology majors heading e-commerce sites for Fortune 100 companies. Whatever. I’ve tried to identify trends, not one-to-one relationships. Do I really have to say that????

And note that I’m not necessarily happy about all this. This is a report/opinion/projection. It’s not a wish list…..