Second monitor

Having spent so much time at work during the last few months, I’ve noticed that many IT guys enjoy working with a two-monitor setup.  I never paid much attention to that fact and thought that those who really need a second monitor are few, and that for the rest it’s mostly showing off.

Last week, in a very spontaneous move, I decided to try it out.  We had a few of those 19-inch AOC monitors around, so I wasn’t exactly robbing anyone or anything like that.  Within minutes I had one unwrapped, connected, and configured in my Gnome, and I have to say that this is one of the best technology experiences I’ve had in the last few years!  It’s totally awesome!

Now, having two monitors configured as one huge desktop, I can keep my browser separated from my consoles, have more code than ever before in front of my eyes without switching virtual desktops, or have all my instant messaging at hand without polluting my main workspace.  That’s brilliant, I tell you.

Downsides?  Yes, sure.  I haven’t yet learned to handle the setup properly, so I have to log out of my graphical interface and log back in every time I take my laptop home.  It would have been so much easier if just plugging the monitor in worked.  I hear that a docking station might improve the situation, but that remains to be seen.
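From what I’ve read, on a recent X.org the xrandr tool can often toggle an external monitor on the fly, without logging out.  A rough sketch, where the output names are just guesses (run xrandr -q to see the real ones on your hardware):

    xrandr -q                                   # list available outputs and modes
    xrandr --output VGA --auto --right-of LVDS  # enable the external monitor next to the laptop panel
    xrandr --output VGA --off                   # turn it off again before unplugging

I haven’t tried this myself yet, so consider it a note for future experiments rather than a recipe.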

And what do I want now?  More monitors.  I’d love to have another monitor at work, and I’d really like at least one more at home.  But there is no place to put it at home (I’m working on a dining table), and I’m not sure there is a way to connect two additional monitors to a laptop at work.  Overall, though, multi-monitor setups are definitely an area I need to investigate more.

Awaiting…

Open source software activity usually bumps up quite a lot before and during Christmas.  This time around I am waiting for:

What are you waiting for this year?

On remote logging with syslog

We’ve been doing some interesting things at work, as always, with yet more people and Linux boxes.  And one of the side effects of mixing people, Linux boxes, and several locations is the need for some sort of centralized logging.  Luckily, we have either syslog-ng or rsyslog daemons installed on each machine, so the only two issues seemed to be reconfiguring the syslog services for remote logging and setting up some log reading/searching tool for everyone to enjoy.
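For reference, the forwarding side is only a line or two in either daemon’s configuration.  A minimal sketch, with loghost.example.com standing in for the central server (adjust names and ports to your setup):

    # rsyslog, on each client: forward everything to the central host
    *.*  @loghost.example.com:514      # a single @ means UDP; use @@ for TCP

    # rsyslog, on the central host: accept syslog over UDP from the network
    $ModLoad imudp
    $UDPServerRun 514

    # syslog-ng, on each client: the equivalent destination and log path
    # (s_src is whatever log source your distribution defines)
    destination d_remote { udp("loghost.example.com" port(514)); };
    log { source(s_src); destination(d_remote); };

Plain UDP is fire-and-forget, which is fine for convenience logging; if losing messages bothers you, TCP is the easy upgrade.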

As for log reading and searching, there seems to be no end of tools.  We picked php-syslog-ng, which has a web interface, a MySQL back-end, access control, and more.  There were a few minor issues during setup and configuration, but overall it seemed to be OK.  I also patched the source code a bit in a few places, just to make it work nicer with our setup and our needs (both numerical and symbolic priorities, preference for include masks over excludes, and full functionality with caching disabled).  In case you are interested, here is a patch against the php-syslog-ng 2.9.8f tarball.

Once everything was up and running and we started looking through logs from all our hosts in one place, one thing surprised me a lot.  Either I don’t fully understand the syslog facilities and priorities (and I don’t claim that I do), or there are just too many software authors who don’t care much.  Most of our logs come in at priority critical, even when there is nothing critical about them.  Emergency is also used way too much.  And there is hardly anything at the debug, info, or notice levels.  (RT, SpamAssassin, and many other applications seem to use critical as their default log level.)  Luckily, that is almost always trivial to fix, either in the configuration files or directly in the applications’ source code.
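By the way, while sorting out where each facility/priority pair ends up, logger(1) is handy for testing your selectors before touching any application.  Something along these lines:

    logger -p mail.info "a routine message, logged at mail.info"
    logger -p daemon.notice "worth noting, but hardly an emergency"
    logger -p daemon.crit "crit should be reserved for things actually on fire"

The messages show up in whatever log files (or remote destinations) your selectors route those facilities and priorities to.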

On laptops in the classrooms

I came across an interesting opinion by David Cole regarding the use of Internet-connected laptops in classrooms during lectures.

study found that laptop use was significantly and negatively related to class performance

While I was reading the article, I kept nodding my head a lot.  Yes, if I were back in college and had an Internet-connected laptop on my desk, I’d be an even worse student than I was.  YouTube, forums, email, Twitter, and a whole lot of other attention grabbers would not leave much room for plain old college education.  At least in my case.  I know.

But then I started thinking about whether that was true for other people I know.  And I couldn’t be so sure anymore.  A few guys I know literally can’t stay for too long without a computer and some sort of Internet connection.  It’s like food or oxygen: they just have to have it.  And when they have access to a computer, it’s often amazing to see them use it.  Lots of interesting, topic-related stuff coming up.  Fact checking.  Exploring the topic deeper and wider.  With quotes and all.

And that got me thinking about a new generation.  Younger people, who grew up online.  The Web is in their blood.  To them, a desktop computer is an ugly concept, and an offline computer a useless box.  That kind of people.  I don’t think they would be much distracted.  In fact, quite the opposite: I think their grades would go up with a better Internet connection and a laptop-friendlier environment.

And that’s where I started worrying a little bit about the studies mentioned in the article.  These studies may be very accurate now.  And they are performed by bigger universities and colleges.  Their results will take a few years to propagate to smaller colleges and universities.  And that’s where the problem will arise.  By that time, most new students will be of the web-native generation, but their alma maters will be choosing to disconnect them and ban their laptops.  Even though it probably won’t be very relevant by then.

But then again, isn’t it like this most of the time?  I think it is.

On software testing

The software is checked very carefully in a bottom-up fashion. First, each new line of code is checked, then sections of code or modules with special functions are verified. The scope is increased step by step until the new changes are incorporated into a complete system and checked. This complete output is considered the final product, newly released. But completely independently there is an independent verification group, that takes an adversary attitude to the software development group, and tests and verifies the software as if it were a customer of the delivered product. There is additional verification in using the new programs in simulators, etc. A discovery of an error during verification testing is considered very serious, and its origin studied very carefully to avoid such mistakes in the future. Such unexpected errors have been found only about six times in all the programming and program changing (for new or altered payloads) that has been done. The principle that is followed is that all the verification is not an aspect of program safety, it is merely a test of that safety, in a non-catastrophic verification. Flight safety is to be judged solely on how well the programs do in the verification tests. A failure here generates considerable concern.

The above was written by R. P. Feynman, in Feynman’s Appendix to the Rogers Commission Report on the Space Shuttle Challenger Accident, 1986.  More than 20 years ago.  Highly recommended reading.

Found via Richard Feynman, the Challenger Disaster, and Software Engineering.