Monday, June 25, 2007

Widescreen Still Cameras

For Christmas, I bought my wife a Panasonic DMC-LX2K digital still camera. This camera has many interesting features, including a Leica lens, but one pushed me towards the purchase: its native widescreen sensor. Unlike most cameras with a 16:9 setting, this camera does not throw away pixels when you shoot with it. Also, the camera records QuickTime movies at greater than DVD resolution (848x480). In good light you would not believe the quality of video you get out of this "still" camera.

In bad light, on the other hand, you would not believe how quickly the quality drops to atrocious. I tell everyone that this is the world's greatest pocket camera, except for the lousy low-light pictures.

But this entry is not about any given camera, but about the benefits of a wider aspect ratio in digital photography and video. If we watch our media content on 16:9 HDTV screens, and the 16:10 aspect ratio is becoming commonplace for computer monitors, why does 4:3 have a near monopoly on still photography? And will this change?

This topic was brought to mind as I was using Front Row on my MacBook to present a series of recent snapshots on my 720p HDTV. All the landscape shots fit perfectly on the screen, showing the content at its best and retaining the original composition of the images.

Photos with other aspect ratios, especially those rotated into the extreme 9:16 portrait orientation, came up with some very odd, pixelated cropping.

Or take this picture of my mother at the beach, in the more common 3:4 portrait orientation:

But even standard 4:3 images can end up missing important details, like me peering over the top of this menu:

It is an implementation detail of iPhoto/Front Row that full-screen images are cropped to fit (this applies to straight viewing of the library, not to slideshows, where you have more control). Apple could just as well have pillarboxed the images, and probably should have, even if that meant showing less detail. None of this negates the fact that the best aspect ratio for photos which will be displayed on an HDTV is 16:9, and 16:9 is a good compromise for full-screen display on the 16:10 aspect ratio of newer Apple monitors. It's certainly better than 4:3. And 16:9 is better for the same reason it and wider formats are used in movies: it's easier to tell stories with pictures if you have room to work with, without your subjects jammed together in unnatural intimacy. People occupy space.

Why is 4:3 still the standard for digital stills if it is no longer the standard for digital video? I had thought the reason was the dominance of 4:3-friendly printing media, in that you can't go to Staples and get widescreen photo paper, but the most common print size is 6:4, which has an aspect ratio of 1.5, roughly halfway between 16:9's 1.78 and 4:3's 1.33. Anyway, most digital photos are only ever seen onscreen, and screens are getting wider; all of Apple's monitors and laptops use 16:10 displays. The iPhone doesn't follow this trend, with its 3:2 aspect ratio; probably a compromise between esthetics and comfortably fitting in the hand.

I suppose manufacturers feel the buying public isn't ready for the change. Photographers have shot 4:3 since the days of glass negative plates and probably don't feel the need to change. Also, I would think (but don't know) that the optics have to be more sophisticated to project such an oblong image without chromatic and other distortions. But the public will change. They will tire of the whole shoddy experience of looking at their hi-tech photos on their hi-tech TVs and grab onto solutions.

Consumers should buy 16:9 equipment when it's available, and compose and crop their shots in future-proof 16:9. Imagine how quaint all those 4:3 pictures you took last month will look 20 years from now. Anybody will be able to tell at a glance that they came from a bygone era, one of tail fins, rotary phones, and poodle skirts.

Thursday, June 21, 2007

Alton Brown Uses My Software

My TiVo has a Good Eats season pass, and you can bet there is one thing that will get me to pause, rewind, slo-mo, and call the wife in from the other room: when he brings out his LX-Proscope and starts looking at the fine details of yeast, sugar, or, most recently, pretzel salt. Why? Because I wrote the Luxus software clearly displayed on his PowerBook. It was my first project using the Qt framework, and it turned out pretty well for what it was; I just wish they had let me make a Cocoa version. Somebody told me they've also seen it being used on CSI. How cool is that?


I bring this up because I was having a hard time finding Mr. Brown's e-mail address to correct his description of what constitutes an acidic solution. His chemistry knowledge is usually good, but not this week.

Wednesday, June 20, 2007

Which Mac OS X Should I Support?

A critical decision any developer of Mac OS X software has to make is where to draw the line between the users we can be troubled to support and the users we can tell to come back when they have a newer operating system. It comes down to the number of users and the effort to support them. First, the numbers. According to Steve Jobs's recent WWDC keynote, there are 15 million Tiger users, 5 million Panther users, and 2 million users of older operating systems.



This is for the overall Mac community; the market segment you intend to develop for may be different. Maybe you are targeting the elementary school market, where 5-year purchasing cycles create huge pockets of happy 10.2.8 users (I guess). Or maybe you are targeting the top-end photo editing business, where the release of a universal binary Photoshop has unleashed a maelstrom of hardware and software purchasing the like of which has not been seen since the world was young (another guess). Temper these absolute numbers with the knowledge that somebody who hasn't shelled out money for a new OS in 6 years is unlikely to buy third-party apps. Regardless, these are the numbers we have and what we'll use.

Apple itself is always happy to tell its developers to forget about users of older releases, if not in so many words. If these users find they cannot run some new application, they will gradually feel the pressure to upgrade, and hopefully upgrade to new hardware; but even selling a $129 OS upgrade to some iBook G4 user is nearly pure profit for Apple. Also, Apple wants us to integrate new OS features, which is hard to do while still holding onto 10.2.8 support.

There are over a million people out there running 10.2.8, some small number running 10.1, and some unlucky few running 10.0 who couldn't be bothered to get the free 10.1 upgrade. Should developers support them? No.

Why?
  • You want to use invasive technologies like Cocoa Bindings, which require 10.3 and above.
  • You won't be able to debug and still use the latest developer tools.
  • You are going to find that these people are 5% of your user base but manifest 50% of your bugs.
  • Your graphics are going to look awful, trust me.


Realistically, only a product with a near 100% installed base and a huge development team, like Acrobat Reader, would even think about supporting 10.2; and look, even Adobe's minimum OS is 10.4.3.

OK, well what about 10.3? Those 5 million users are awfully tempting. Yes, they are. And I'd still lean against supporting them in any project you are starting today. Why?
  • Debugging, while possible, is hard. Maybe you are the master of command line gdb or remote debugging, but I am not.
  • By the time you get finished October will be here and those 10.3 users will be squished into a smaller slice by Leopard.
  • There are many 10.4 specific technologies in Cocoa and Core Foundation that I'd like to use.
  • A one or two person team only has so much time, and there are a lot of better places to put your effort.


So my advice is to target 10.4, and be thankful you never had to support Windows ME. The horror, the horror.
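
If you do draw the line at 10.4, enforce it in two places: set the deployment target in your Xcode project (the LSMinimumSystemVersion Info.plist key serves a similar purpose for Launch Services), and add a cheap runtime guard. A minimal sketch of such a guard using the long-standing Gestalt call; the wrapper name is my own:

#include <CoreServices/CoreServices.h>

// gestaltSystemVersion reports the OS version in BCD, so 10.4.0 is 0x1040.
static bool RunningTigerOrLater()
{
    SInt32 version = 0;
    if (Gestalt(gestaltSystemVersion, &version) != noErr)
        return false;
    return version >= 0x1040;
}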

Friday, June 15, 2007

iTunes Plus - It Doesn't Nauseate Me

I opened up iTunes yesterday and navigated to the iTunes Plus page. I was offered the opportunity to upgrade my 21 EMI tracks to the new format for $6.30 ($0.30 apiece). I accepted the offer. I'd like all the music companies to let their music use this format; and this is how we consumers vote: with money. Not that Apple's DRM has hurt me. About once a year, my iPod forgets it's authorized to play my tracks, requiring a sync; but otherwise it's been transparent. What I like is the knowledge that my music is now truly my music. That, and the quality.


I know some people believe an ordinary person can't tell a 128 kbps AAC file from a 256 kbps one, and this may be literally true when comparing a 10 second clip in one format with the same 10 seconds in the other. The difference is in the long term.


Backing up: in one of my previous incarnations, I helped write the ill-fated OS X client for MusicMatch. Ill-fated because Apple came out with iTunes, and it's hard to compete with free.


People at MusicMatch were used as convenient guinea pigs to determine just how much you could compress various styles of music before you started to hear the compression; the more compression, the more MusicMatch could save on bandwidth costs. Too much compression, though, and people would hate it. The goal was to deliver good sound, but not wastefully good sound. We'd put on big noise-isolating headphones and listen to several clips at various compression levels, plus an uncompressed version, straining to hear the difference. Over and over for an hour. I became reasonably good at catching the differences and at picking out the instruments especially susceptible to distortion: those that rattled, scratched, or were high pitched went first, their fine details lost; things like bass guitars, not so much.


And sometimes I couldn't tell the difference. But one thing I can tell you about overly compressed music, even music I couldn't distinguish from the original: it made me sick. Not throwing-up-in-the-wastebasket sick, but mildly nauseated, or at least left with a general feeling of non-wellness. At the time, I thought it was just from having to listen to the same Backstreet Boys riff too many times, but I have a new theory.


Music is compressible because the human mind can be fooled. The AAC and MP3 codecs use psychoacoustic tricks which make the mind think it's hearing what it isn't. The actual sound waves don't look much, if anything, like the original; the details are faux. For me, this trickery disorients and tires my brain, and the more compression, the more intense the feeling. The less compression, the less trickery, and the more wholesome the music. Not that 128 kbps AAC is as bad as 96 kbps MP3; I'm just saying I can still barely feel the unnaturalness of it. Similarly, I hate listening to stereo music manipulated by the Dolby Pro Logic filter through my SphereX surround speakers; a few seconds of that and I'll be diving for the remote's 2.1 button and its refreshingly clean sound.


I'm listening to some Norah Jones in iTunes Plus format now. Her piano sounds rich and beautiful; I love the sound of a grand piano. I feel good. The music is making me feel quite mellow. I can't consciously tell that the harmonics of the acoustic guitar in my left ear are more realistic, or that the tambourine being slapped in my right ear is less distorted, but my sense of well-being can. So, anyway, I paid my $6.30.


[Update: I suppose I should point out that I'm not claiming any expertise in audio compression or the psychology of sound perception. I'm an application programmer. No sensible person would hire me to write an audio codec.]

Sunday, June 10, 2007

AAAAAARG - The Sound of a Corrupted Videoserver

So, I rebooted my MythTV box into Windows Friday morning. The system asked me if I wanted to set up the drivers for the new Rosewill SATA card I was using to access the 500GB video server drive. This seemed reasonable enough.

Bad move.

Somehow, over the course of setting up the driver, Windows managed to severely damage the ext3 partition on the drive. I lost everything that was on it, much of it with no backup.


So, reformat. Reboot into Windows to see if it would be gentler now that the drivers were set up. Yes. Start over again.

While I was at it, I installed the ext2fs modules for Windows. For the 30 seconds I spent testing them, they seemed to work fine, showing my Linux partitions on the Windows desktop.

Thursday, June 07, 2007

Port Model Train Software to OS X - Under the Hood

This is the second of two posts on my experience helping to port TrainPlayer to Mac OS X. It covers my theories and designs for using OS X application technologies in a project which shares code with a Windows C++ application. This is not a code-level look at such things, but a top-level view. Also, while the framework was designed from the start to be cross-platform, sharing code between Mac and PC, TrainPlayer for the PC is still using its old codebase, and may never make use of the techniques described below.

Why Is Cross-Platform Development Hard?

It may be a myth that the Russians chose a different railroad gauge to prevent invasion, but it is not a myth that commercial OS companies avoid open standard development APIs to keep their developers their developers. Neither Microsoft nor Apple cares much for Java, with Microsoft going so far as to invent a language, C#, which looks like someone took a thesaurus to Java. The careful software engineer can still write the bulk of an application in vanilla C++ with wrappers around the native GUI and application frameworks. But you have to be very careful.

Rejected Paths

You can write C++ applications with the Qt framework and have them run on Windows, Mac OS X, and Linux. I've written three such applications, and every day I use Qt applications like MythFrontend and a Perforce client. My problem with Qt is that I don't believe it is possible to write a first-class OS X application in Qt. You can make a serviceable Windows application in Qt, but then again, Windows users are not known for their esthetic sense. Also, there are licensing fees for commercial development, and Qt does a lot of weird things with C++ to allow the visual design of interfaces, with the side effect of locking you into Qt.

We could have written the whole application in Cocoa with no reuse of the Windows code, sharing just the document schema. But that's an awful lot of wisdom to throw away.

The Path We Took

What is more valuable, the code you write, or the code you get for free with a framework? The framework code, because you won't have to throw money and time at maintaining it. Do not replicate what you are given for free on a platform for the sake of sharing cross-platform code, even if it means replicating some bit of functionality on the other platform, or, horrors, having the Mac version behave differently than the PC version. I can't tell you how many times I've heard the same excuse about how the manual will be confusing if the versions aren't exactly the same. Get used to it: Mac applications are similar to PC applications, but they are different, and Mac users demand their differences. And few people read the manual anyway.

For example, most applications have simple menu needs. You could set up all the menu items in TrainPlayer in Apple's Interface Builder in an hour. You could add a menu item in 4 minutes. It is not worth anybody's time putting together some elaborate framework of XML description files so that you don't have to edit the menus in both Interface Builder and Visual Studio. Yes, occasionally you will forget to add a menu item on the other platform; better that than maintaining 500+ lines of code for cross-platform menus. Such code would be especially problematic in Cocoa, where many menu items are loosely wired to the first responder instead of to a document or application class. On the other hand, if your menu needs are complex (perhaps you have a dozen different versions of your application, all with different menu subsets), then it's time to write yourself an XML schema and go to town. Thankfully, such was not the case here.

Of course, Cocoa makes it so easy to do just about anything in the GUI that every time this comes up, you end up just whipping up a little Objective-C class to provide the data: the car collections dialog, the pre-defined layouts browser. If any of these did anything truly complicated, the prescribed action would be to create a cross-platform model class, but anytime the Objective-C to C++ glue code is nearly the same size as the actual C++ code, it's time to just keep that part platform-specific.

Which brings me to what, if anything, is cross-platform here? Anything that involves manipulation of the document class and its data structures. We created a subclass of NSDocument and had it host an instance of the C++ object which contains the actual document data structures. The NSDocument subclass does things like passing in menu commands and connecting the cross-platform code to the main view in the document window.
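
To make that concrete, here is a hypothetical sketch of the hosted object's shape; the names are my own invention, not TrainPlayer's actual classes:

#include <string>

class DrawingSource; // described in the next section

// Invented names sketching the division of labor: the NSDocument subclass
// owns one of these and forwards everything document-shaped to it.
class DocumentEngine {
public:
    bool ReadFromFile(const std::string& utf8Path);      // called from the NSDocument read override
    bool WriteToFile(const std::string& utf8Path) const; // ...and from the write override
    void PerformMenuCommand(int commandID);              // validated menu items funnel through here
    DrawingSource* MainViewSource();                     // wired to the document window's main view
};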

Cross-platform GUI

Any view which is composed of non-standard content, i.e. isn't composed of buttons, text, sliders, etc., but is something special to this application, is created by adding an NSView subclass we called a LayeredView and giving it a C++-based DrawingSource. This includes the main document view, the clock, the control panel, the side view of trains in the control panel and in the toolbar, and the switch editing panel. All these views, and only one NSView class. All this rendered content, and the logic behind it, is entirely cross-platform. On the PC, I've written a corresponding MFC class which provides the same function as a proof of concept.

The LayeredView object for each platform provides:
  • Hosting for an arbitrary number of layers
  • Capture of mouse and keyboard events, which are passed to the DrawingSource
  • Drawing of each layer when requested by the source
  • Popping up contextual menus
  • Zoom level support

A DrawingSource object provides (both contracts are sketched in C++ after these lists):
  • A list of drawing primitives (more on those later) for each layer
  • An affine transformation for each layer
  • Mouse tracking
  • Key press handling
  • Determination of which contextual menu to show
  • Idle handling

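Here is that pair of contracts reduced to a hypothetical C++ sketch; TrainPlayer's real names and signatures surely differ, and the event and transform types are stand-ins:

struct MouseEvent { double x, y; int button; };
struct KeyEvent { int keyCode; };
struct AffineTransform { double a, b, c, d, tx, ty; }; // illustrative 2x3 matrix
struct DrawingList; // sketched under Drawing Lists below

class DrawingSource {
public:
    virtual ~DrawingSource() {}
    virtual int LayerCount() const = 0;
    virtual const DrawingList& ListForLayer(int layer) = 0;         // drawing primitives
    virtual AffineTransform TransformForLayer(int layer) const = 0;
    virtual void MouseEventOccurred(const MouseEvent& e) = 0;       // tracking
    virtual void KeyPressed(const KeyEvent& e) = 0;
    virtual int ContextualMenuForPoint(double x, double y) = 0;
    virtual void Idle() = 0;
};
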
Because the LayeredView on the Mac is an Objective-C object, the DrawingSource does not talk directly to the LayeredView, but forwards requests for services such as layer invalidation, size changes, and screen coordinate conversions through a C++ DrawingDelegate object, which the LayeredView installs in the DrawingSource when the two are initialized together. In a mixed-language setup, you will have to create these small interface objects anytime cross-platform C++ has to interface with some other language.
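
The delegate itself can be equally thin; again a hypothetical sketch:

// The thin C++ face the Objective-C LayeredView hands to its DrawingSource,
// so the cross-platform side never touches an Objective-C type directly.
class DrawingDelegate {
public:
    virtual ~DrawingDelegate() {}
    virtual void InvalidateLayer(int layer) = 0;                  // mark a layer dirty
    virtual void ContentSizeChanged(double width, double height) = 0;
    virtual void ViewToScreen(double& x, double& y) const = 0;    // coordinate conversion
};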

Drawing Lists

The DrawingSource provides a DrawingList, which is just an STL vector of primitive graphic operations on primitive graphic structures. For example, adding a point to the current path is an operation, stroking the current path is an operation, drawing a bitmap at a given point with a given rotation inside a given box is an operation, and so on. Lists have a number of advantages over the alternative of providing a wrapper API with a whole slew of drawing methods like FrameRect(), FillRect(), FrameOval(), FillOval(), FrameBezier(), FillBezier(), DrawText(), ClipPath()..., because graphic toolkits often don't map well to other toolkits. The PC TrainPlayer, for instance, makes heavy use of GDI Brush objects, to which a Cocoa programmer might say "What's that?" and munge together some state object which approximates a Brush. With a DrawingList you are just creating a list of platform-neutral instructions for drawing something; you don't have to worry about passing through a platform-appropriate drawing context, and you don't have to worry about flushing too often or not often enough. There will be a single routine implemented which knows all about drawing on the current platform, and it will be optimized for rapid drawing: on the Mac using the Quartz API, on the PC using GDI+. You could easily imagine alternative renderers based upon OpenGL or any other modern toolkit.
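
A hypothetical sketch of the idea, with a deliberately tiny op set (a real list carries much more):

#include <vector>

struct DrawingOp {
    enum Kind { MoveTo, LineTo, ClosePath, StrokePath, FillPath,
                SetRGBAColor, DrawBitmap } kind;
    double x, y;       // point or bitmap origin, when applicable
    double r, g, b, a; // color components, for SetRGBAColor
    // a real op set also carries image references, text runs, clip paths...
};

struct DrawingList {
    std::vector<DrawingOp> ops; // replayed in order by the platform renderer
};

// Each platform then implements a single replay routine, something like:
//   void RenderList(CGContextRef ctx, const DrawingList& list);     // Mac, Quartz
//   void RenderList(Gdiplus::Graphics& g, const DrawingList& list); // PC, GDI+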

DrawingLists are conducive to layered drawing. Just keep a separate list for each layer. If nothing changes in that layer (maybe it's the layer of station switches in TrainPlayer, and the current update only involves the cars moving), then there is no need to recalculate it; just redraw it.

Another nice thing about DrawingLists, which I didn't take advantage of here because of all the legacy code, is that they can be generated on a separate thread. On multi-core machines (i.e. all new machines), this can be a big win, especially if determining what to draw is computationally intensive. And each layer could have its own thread. Graphic toolkits tend to require that drawing be done on the main thread, which makes any preparatory step that can be done in child threads helpful.
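
A minimal sketch of what that could look like with plain pthreads, reusing the DrawingList sketch above; the builder function is an invented stand-in for whatever expensive computation fills the list:

#include <pthread.h>

static DrawingList BuildTrackLayerList(); // invented: fills a list, expensively

static pthread_mutex_t gListLock = PTHREAD_MUTEX_INITIALIZER;
static DrawingList gPendingList; // guarded by gListLock

static void* ListBuilderThread(void*)
{
    DrawingList fresh = BuildTrackLayerList(); // pure computation, no drawing calls
    pthread_mutex_lock(&gListLock);
    gPendingList.ops.swap(fresh.ops);          // publish the finished list
    pthread_mutex_unlock(&gListLock);
    return 0;
}

// Kicked off from the main thread whenever the model changes:
//   pthread_t builder;
//   pthread_create(&builder, NULL, ListBuilderThread, NULL);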

Data Types

I prefer using cross-platform data structures based upon STL or Boost, but sometimes this is not practical. Necessity and performance sometimes force the use of platform-specific structures; you don't want to spend your rendering time converting raw bitmaps into platform-native images. Therefore, I defined a set of typedefs and macros:
#if _MAC
#include <ApplicationServices/ApplicationServices.h> // CF and CG types; the original header names were lost, so this is my best guess
#include <string>
typedef CGImageRef ImageBasis;
typedef CFStringRef StringBasis;
typedef std::string FileNameBasis; // treat as UTF-8
typedef CFURLRef FilePathBasis;
typedef CGAffineTransform AffineTransformBasis;

...

#define STRING_RETAIN(nativeString) if(nativeString != 0) ::CFRetain(nativeString);
#define STRING_RELEASE(nativeString) if(nativeString != 0) ::CFRelease(nativeString);
#define STRING_LENGTH(nativeString) ((nativeString == 0) ? 0 : ::CFStringGetLength(nativeString))
...
#else // not _MAC
#include <string> // again, guessed headers
#include <gdiplus.h>
#include <boost/shared_ptr.hpp>
typedef std::wstring FileNameBasis;
typedef std::wstring FilePathBasis;
typedef boost::shared_ptr<Gdiplus::Bitmap> ImageBasis; // the template argument was lost; Gdiplus::Bitmap is a guess
typedef Gdiplus::Matrix AffineTransformBasis;
typedef std::string StringBasis;
...

In general, the cross-platform code gets passed these structures and passes them through to the platform-specific code unchanged, except for memory management.
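
For instance, a cross-platform class might hold a StringBasis like this (a hypothetical usage sketch; on the PC the macros would expand to nothing, in the branch elided above):

// Cross-platform code treats StringBasis as opaque and leans on the macros
// for lifetime: CFStringRef on the Mac, std::string on the PC.
class Car {
public:
    explicit Car(StringBasis name) : mName(name) { STRING_RETAIN(mName) }
    ~Car() { STRING_RELEASE(mName) }

    void SetName(StringBasis name)
    {
        STRING_RETAIN(name)  // retain first, in case name == mName
        STRING_RELEASE(mName)
        mName = name;
    }

private:
    StringBasis mName; // copying is left out of the sketch
};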

And by the way, whatever string class you use, make sure any string the user sees is created and stored as Unicode; it's 2007, people!

Mac Specific Code

Rendering was done with the Quartz API, with an assist from ATSUI for text rendering. Layers were handled via CGLayerRef (which TrainPlayer later simulated to get OS X 10.3 support). Utility windows, dialogs, and the inspector floater were straight Cocoa, using Cocoa Bindings to interface with my NSDocument subclass. This was my first major use of Cocoa Bindings, and I was extremely disappointed in its stability; the biggest problem I had in this development was trying to work around odd crashes deep in the bindings machinery.
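
For the curious, the CGLayerRef pattern boils down to rendering a slow-changing layer once into its own context, then stamping it cheaply on every update. A minimal sketch, reusing the illustrative RenderList() and DrawingList names from above:

#include <ApplicationServices/ApplicationServices.h>

void RenderList(CGContextRef ctx, const DrawingList& list); // hypothetical replay routine

static CGLayerRef CacheTrackLayer(CGContextRef destination, CGSize size,
                                  const DrawingList& trackOps)
{
    CGLayerRef layer = CGLayerCreateWithContext(destination, size, NULL);
    RenderList(CGLayerGetContext(layer), trackOps); // replay the layer's list once
    return layer; // the caller stamps it each update with
                  // CGContextDrawLayerAtPoint() and CGLayerRelease()s it when stale
}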

PC Specific Code

As I said, I only created a proof-of-concept drawing list renderer for the PC, using the GDI+ toolkit embedded in an MFC application. GDI+ can do much of what Quartz can, although I missed the ability to cast shadows from arbitrary objects, and I had to do extra work to get layers.

Conclusions

I'm thankful for the opportunity to try out my theories on cross-platform development on a live target. I'm most pleased to have come up with a methodology which has the potential to share large amounts of code between platforms, while still allowing me to create a first-class OS X application. I'm especially fond of the drawing list design, and find it a good way to factor drawing calls even where cross-platform development isn't a concern.

If I had had more time to devote to the project, I'd have worked on integrating technologies like the platform-specific undo managers into the design, but with the birth of my daughter I have zero excess development time. Things could always be better, but I gave TrainPlayer a good start.

Monday, June 04, 2007

The Golden Age of OS X Independent Software

I've found myself buying a lot of software for my MacBook these days. It isn't because I'm swimming in money, because I most certainly am not. It's because I have software needs and wants, and my peers are out there satisfying them. Here is what I've paid my own money for in the last few months:

  • Tables is a very satisfying spreadsheet which is nothing but a spreadsheet. It is very Mac-like, understated, stable, and does what I expect of it. I chose it over Mesa, even though Mesa did charts, because of Tables' obvious emphasis on details. And now it does charts too, although they need a little work. I'd been using AppleWorks way past its expiration date.

  • YummyFTP is exactly what I was looking for in an FTP client. Believe me, I tried many of its competitors before settling on this classy little Cocoa gem. I needed a client which could deal with an incredibly unstable Chinese FTP site, and this was up to the job.

  • DVDPedia was recommended on the HT Guys podcast and I like it too. It has its share of interface issues, but a lot of craft has gone into it.

  • Remote Buddy I've mentioned before. I know from experience that input devices are cranky things, but Remote Buddy makes handling them seem effortless.

  • SuperDuper!'s free features are so good I haven't even paid for the premium features, but I probably will. After getting the heads-up on this backup software from The MacCast, I used it to transfer the contents of my hard drive before upgrading to a 200GB internal. It took a long while, but it was obviously very careful about protecting my data.


What does all this software have in common? Open their package contents in the Finder and you will see the unmistakable traces of Cocoa development. It's a lot easier today for one or two people to write an insanely polished application, because we have Cocoa to handle the parts every application shares, so we can concentrate on what makes our applications unique.