Wednesday, December 26, 2007

Cute Safari Tip: “Apple” to Apple Symbol 

The default Safari bookmark bar has a link to “Apple”, and an “Apple”-labeled RSS feed of Mac news. You can save space by replacing the word “Apple” with the apple symbol: . (Anybody reading this on a non-Mac will likely see a question mark or an empty square instead of the apple-with-a-bite-out-of-it logo.) This symbol can be found in the Special Characters window as Unicode F8FF (decimal 63743), or generated by shift-option-K.
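You can also emit the character from a Terminal; a quick sketch, assuming a UTF-8 terminal (the octal escapes below are just the UTF-8 bytes for U+F8FF):

```shell
# Emit the Apple logo character, U+F8FF (decimal 63743).
# EF A3 BF is the UTF-8 encoding of that codepoint,
# written here as octal escapes for portability.
apple=$(printf '\357\243\277')
printf '%s\n' "$apple"
```

As in the browser, a non-Mac font will render this as a question mark or an empty box.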

Other Mac-specific symbols can be found at MacBiblioBlog.

[Updated: Had given the wrong key equivalent]

Wednesday, December 19, 2007

The Chaotic Nature Of Windows' Menus

As part of my day job, I've recently become familiar with the creation, display, and destruction of menus on the Windows platform. In writing this entry, I hope not only to point out the aesthetic and technical disaster that is the Win32 menu framework, but also to offer a caution against falling into the same sort of problems on the Mac.

From the user's point of view, menus on Windows are a chaotic mix of styles. While we Mac users are used to the menus of each application looking the same on any given OS version, Windows users are treated to menus of wildly varying style, depending on the artistic sensibility of the individual programmer. Here are just a few of the menus found on my wife's fairly vanilla Windows Vista install:

Note the random selection of:
  • Highlight colors
  • Separator graphics
  • Highlight region shapes
  • Margins
  • Margin separators
  • Root menu item shapes
  • A ball icon where another application would use a check mark
  • A "Ctrl-P" instead of a "Ctrl+P"
  • Minimum spacings between the title and the accelerator
  • Sizing of the submenu triangle
  • Drawing modes for greying out icons
  • Degrees of fade for greyed out text
  • Alignments of the accelerator text

Contrast this with a selection of applications on Mac OS X Leopard:

Although even Apple can't protect us from the attack of the ugly cross-platform icons:

Why the disparity? Because when programming to the Win32 API, Microsoft requires all but the simplest menus to be drawn by application code responding to window messages: WM_DRAWITEM and WM_MEASUREITEM. The system doesn't make it obvious where to draw your text, how to highlight selections, what the margins are, etc. The only thing the system will do for you is erase the background. If all you want to do is draw an icon, perhaps the favicon for a website, you have to manually draw the whole menu item; if you want any hope of having that menu item look like all the other menu items in that menu, you have to draw all the items in that menu; and if you do that, you will want to draw all the menu items in all the menus. If you code using the MFC framework, you will end up using some 3rd party menu class, written by somebody in the same boat as you, having to emulate and approximate whatever style is used by their favorite Microsoft app on any particular OS. Future changes in appearance will not magically arrive with a new OS, but will require each application to be reprogrammed, or at least to have a dynamic library updated.

Again, the major reason Windows programmers draw their own menus is to add an icon to a handful of menu items—which is usually the idea of some marketing guy anyway. If Microsoft had only added the ability to provide a graphic, few people would have bothered to draw their own. Life is short. If they had done so, they could have added all sorts of cool wizbangery to Vista, and we Mac users would be reduced to complaining how they stole translucent menus from Apple.

And while it may seem like a minor thing, Microsoft does not provide separate routines to specify the text of a menu and its key equivalent accelerator. Even if you aren't drawing a menu yourself, you have to provide a string like "&Redo\tShift+Ctrl+Z".

Hopefully, .NET programmers have an easier time.

The net effect of all this is that Microsoft cannot:
  • Upgrade the operating system and come up with a cool new appearance in which all the 3rd party applications (or even old Microsoft applications) magically share how their menus appear. Click on a menu and it's 1986 all over again.
  • Have any kind of unified appearance.
  • Replace those awful-looking, English-centric, space-hogging accelerators with elegant Apple-style meta-key glyphs.
  • Seamlessly move to *resolution independent GUIs. Apple's having trouble getting 3rd party applications to be resolution independent, but the fact that the OS draws the menus at least eliminates that as a problem.

* Resolution independence is the idea that in our future of ultra-high resolution monitors, a menubar for instance won't be 25 pixels tall, but will be 25 points (25/72nds of an inch) with very high quality text.


While all this Microsoft bashing is fun, this is a morality tale of a sort. Apple in its wisdom has given Cocoa developers much more control over how menus are drawn in Leopard. You can in fact, draw anything you please in a menu now. Don't do it. It was bad enough when you could clutter up your menus with blocky little useless icons, now the user can innocently click on a menu and be confronted with the Gates of Hell. Don't do it. Resist the temptation. Put the NSView down. If you are porting a Windows app, do not let the pointy haired boss force you into porting the ugly menus. Tell him it can't be done. Tell him you would lose Tiger compatibility. Tell him anything, but don't mess with Mac menus.

"Just because we can do a thing does not mean we must do that thing." (The Federation President in Star Trek VI: The Undiscovered Country)

[Entry updated to point out more things wrong with the Windows menus and to add detail.]

Monday, December 03, 2007

Off Topic: The Perception of Time as I Age

2 observations:
1) The months pass faster than when I was younger.
2) I feel slower mentally. The ideas come at a markedly lower rate.

Is it possible that these two observations are sides of the same phenomenon? Perhaps my perception of time is based upon how fast I think. If my younger self had 1000 ideas a month, and my present self creates 500 ideas a month, then do I perceive 2 months as taking as long as 1 month used to take? It seems likely.

Thursday, November 22, 2007

There Must Be a Cocoa iTunes Coming + Vista 64-bit iTunes?

[Update: Apple released iTunes 8, and whatever it is, it is not pure Cocoa]

[Update: Apple did release a version of iTunes for 64 Bit Vista. I don't know how it matches up with the below speculation. ]

This is entirely speculation on my part. If I had insider information, I would not betray it. This is just my uninformed opinion.

Fact 1: iTunes as we know it is a Carbon application on the Mac. It is, after all, the direct descendant of SoundJam, a short-lived MP3 player for the Classic Mac OS. Open up its package in the Finder and you'll see things like a .rsrc file.

Fact 2: iTunes makes heavy use of QuickTime and the Mac Toolbox emulation layer which QuickTime for Windows contains. This is what allowed the same codebase to be deployed on both OS X and Windows relatively quickly. (OK, this may not be a fact, not having access to the iTunes code, but it seems darn likely.)

Fact 3: There is no 64-bit C API to QuickTime on the Mac. There is only a 64-bit Cocoa interface.

Fact 4: Pretty much all of that Mac Toolbox being emulated on Windows is no longer available under 64-bit compiles on the Mac.

Fact 5: All new Macs are 64-bit. Apple is in a good position to start promoting themselves as The 64-bit Company, as their path to 64-bit purity seems easier than Microsoft's, but it will take getting all their apps there first, and convincing their major developers to follow. I assume it was easy getting their Cocoa applications there, but any residual Carbon applications would be a nightmare.

Extrapolation 1: Even Apple will not be able to make iTunes a 64-bit application without a major rewrite. After you remove QuickTime, all the old Classic Mac APIs which are not in the 64-bit frameworks, etc., there is little left. Maybe much of the task-specific onscreen drawing on the Mac is now done with Quartz calls, the networking is probably done with CFNetwork calls, and maybe they are using WebKit to do the store; all that can be salvaged, and hopefully the code is factored such that system calls are not sprinkled all over the place. But they are going to have to bite the bullet, write a from-scratch Cocoa application, and call it iTunes 8.

Extrapolation 2: If Apple is not going to extend the life of the Classic Mac APIs and the QuickTime C APIs for 64-bit Mac applications, what are the odds of them doing so for the emulation layer on 64-bit Vista? Pretty low. There is no obvious quick way to get iTunes as we know it on Windows running as a 64-bit process. I suppose they could if they really wanted to, but it seems a pretty hacky solution and would basically obligate them to maintain an entire OS's SDK just to run iTunes and QuickTime.

Extrapolation 3: I assume Apple cares about the future of QuickTime and iTunes on 64-bit Windows. This is not a big market now, but it will only get bigger, and Windows users are not going to accept running multimedia software in 32-bit emulation mode in perpetuity. They can do one of two things: create a new framework which can play and edit QuickTime content, perhaps a set of .NET classes, and write a separate version of iTunes on top of those classes; or bring Cocoa to 64-bit Windows, and build the same iTunes source as on the Mac. I suspect Apple doesn't want to do either, but iPod sales are important to them. They wouldn't want those early-adopter 64-bit users running to the Zune.

A little Googling on "64-bit iTunes" turned up this revealing alert text in the current Windows iTunes: "This iPod cannot be used because the required software is not installed. Run the iTunes installer and remove iTunes, then install the 64-bit version of iTunes." That makes me wonder if all this is going to fall out sooner rather than later. Or maybe it's an aberration.

Anyway, that's my uninformed opinion, and we'll see if it has any relationship to reality.

[Update: See this article about how Adobe is not releasing a 64-bit version of Photoshop until they can do the major re-writing required to remove their Carbon code.]

Wednesday, November 21, 2007

The Future or Lack of It for Carbon

If you are tasked with maintaining a large, old—is there any other kind—Carbon application, here's an illuminating experiment. In Xcode 3.0, under the Architectures build setting, turn on 64-bit compilation for Intel. Now try to compile.

In my case, I see errors and lots of them:

error: '::FlushVol' has not been declared
error: 'HideCursor' was not declared in this scope
error: 'ShowCursor' was not declared in this scope
error: 'LMGetFractEnable' was not declared in this scope
error: 'GetOutlinePreferred' was not declared in this scope
error: 'GetPreserveGlyph' was not declared in this scope
error: 'GetWindowEventTarget' was not declared in this scope
error: 'SetFractEnable' was not declared in this scope
error: 'SetOutlinePreferred' was not declared in this scope
error: 'MoveWindow' was not declared in this scope
error: 'SetOutlinePreferred' was not declared in this scope
error: 'OpenCPicture' was not declared in this scope
error: 'SetControlPopupMenuHandle' was not declared in this scope
error: '::SetControlMaximum' has not been declared
error: '::DrawThemeMenuItem' has not been declared
error: 'GetIntlResource' was not declared in this scope
error: 'CompareString' was not declared in this scope
error: '::SetMenuWidth' has not been declared
error: 'IsWindowVisible' was not declared in this scope
error: '::GetPicture' has not been declared
error: '::DrawPicture' has not been declared
error: '::EraseRect' has not been declared
error: '::FindDialogItem' has not been declared
.... lots more errors ...

So we have a litany straight out of Inside Macintosh Vol. 1-3: all of QuickDraw, much of the File Manager, the Control Manager, the Dialog Manager, the Window Manager, the Menu Manager, and the Font Manager, all gone (but surprisingly not the Resource Manager), along with anything that uses Pascal strings or FSSpecs. Few of the APIs which came from the Classic Mac OS survive a 64-bit compile.

I don't know what the short-term carrot is for getting independent developers to make 64-bit compiles. Unless you are doing something that really needs 6 GB of real RAM—very few applications—there seems to be little performance advantage to 64-bit. And it's hard taking a Carbon application, ripping out the APIs behind its entire GUI, and putting it back together again. Lots of redesign and drudgery for little immediate payoff. In the long term, Apple will presumably stop supporting 32-bit applications, but that is in the very long term.

Plus, there are little bits of functionality in the Carbon APIs which are hard to replicate in the more modern APIs. One example is embedding one's private metadata in the PICT clipboard flavor; I'd very much like to know how to do that with the PDF API. Another is the XOR drawing mode for doing tracking: stupid and old-fashioned, yes; hard to replace, also yes.

There's an Ars Technica article which goes into this in more detail.

Monday, November 19, 2007

HDHomerun Gets More And More Useful

[Update: check out Signal GH, an iPhone utility for monitoring the signal quality of an HDHomerun].

One of the best bits of tech I've bought over the last year is the HDHomerun networked digital HDTV tuner. I was reminded of this a couple of weeks ago, when I was going through the process of setting up a Mac Mini as a dual boot Leopard/Vista desktop.

The OEM copy of Vista Home Premium I picked up cheap at MicroCenter comes with Windows Media Center, which Silicon Dust supports for use with the HDHomeRun, so for no added cost I get live TV watching with a guide, and recording. This is functionality I would not easily get otherwise, as there is no MythTV client for Vista, and I am not going to buy an additional tuner for the Mini. I get all this for free simply because I happen to own an HDHomeRun.

Not that I'm very impressed with Windows Media Center, at least not on the Mini. HD playback is a bit stuttery (unlike VLC on the same OS and hardware), and navigation is confusing. Also, there doesn't appear to be an integrated solution for using my Wiimote; Remote Buddy on the Mac spoils you for effortless couch-potato-style navigation. MCE does allow the same application to control both live TV viewing and DVD playback, whereas I have to switch between EyeTV and Front Row on the Mac, but that is comparatively easy with Remote Buddy.

Another HDHomeRun nicety is that EyeTV can Picture-in-Picture both HDHomeRun tuners (again, on the same hardware Vista MCE can barely decode one stream), allowing me to watch two HD football games simultaneously, which will be my preferred mode going forward. I like to watch football live, so EyeTV wins over MythTV, whose live TV support is always a bit cumbersome, and the Picture-in-Picture feature seals the deal.

Anyway, my main point is to emphasize the value one gets from a networked device. Because the HDHomerun is accessible from any computer in my house, any computer capable of decrypting an HD stream can use it. In my case, I have 3 computers (my MacBook, a Mac Mini, and my Linux server) and 3 operating systems (Leopard, Vista, Linux), each with their own strengths, sharing this resource.

Wednesday, November 14, 2007

The Cover Flow Compulsion

Compulsion can be a silly thing. Ever since Cover Flow became ubiquitous on the Mac, I've been on a mission to stamp out generic covers. I spent 5 minutes last night looking for cover art for an MP3 track, Voyager "Motion" by someone named Sandra Collins, which I'm not sure where it came from (did it come bundled with my Fisher-Price iBook 6 years ago?), and which I have told iTunes never to play. That's just silly. But look at the pretty pictures swirl by....

Yes, Walt Mossberg, Cover Flow is addictive.

For tracks iTunes can't automagically find artwork for, Amazon is a gold mine of customer-submitted artwork, at least for popular stuff; you can pick and choose your Beatles covers.

Then there's the DVD situation. MythTV wants your artwork in one folder where its database can keep track of it. Front Row 2.0 wants a preview.jpg file in the same folder as the VIDEO_TS. Vista Media Center Edition wants a folder.jpg in the same place.

My process was as follows: I copied the artwork from DVDpedia (which had downloaded it from Amazon), used Apple's Preview application to create a new file from the clipboard, and saved it as preview.jpg in every DVD folder on my server. Creating the folder.jpg files was a bit easier, as I wrote a script to make symbolic links from the preview.jpg, the equivalent of typing
$ ln -s preview.jpg folder.jpg
170 times. Try to let the computer do what computers do best. A lot of busy work, but look at the pretty pictures swirl by...
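The script itself was nothing fancy; here's a minimal sketch of the idea (the demo directory names are made up — the real layout is one folder per DVD, each holding its VIDEO_TS and a preview.jpg):

```shell
#!/bin/sh
# Build a throwaway demo tree: one directory per DVD, each already
# holding the preview.jpg saved out of Preview.app.
DVD_ROOT=$(mktemp -d)
mkdir -p "$DVD_ROOT/MovieA" "$DVD_ROOT/MovieB"
touch "$DVD_ROOT/MovieA/preview.jpg" "$DVD_ROOT/MovieB/preview.jpg"

# For every preview.jpg, create a folder.jpg symlink alongside it,
# skipping any folder that already has one.
find "$DVD_ROOT" -name preview.jpg | while read -r preview; do
    dir=$(dirname "$preview")
    [ -e "$dir/folder.jpg" ] || ln -s preview.jpg "$dir/folder.jpg"
done
```

Point the find at the real DVD root and Front Row sees its preview.jpg while Vista MCE sees its folder.jpg, without storing the image twice.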

SVG on WebKit - Improving

I'm gratified to see WebKit's continuous improvements in its support for SVG. Recent builds fixed a bug I and many others reported, involving non-support for superscripted and subscripted text. This might seem a minor thing, but SVG support means, for instance, that someday soon Chemistry students will be able to go to a structure on Wikipedia via their iPhone, Android phone, or iPod Touch and actually zoom in on any portion of it perfectly, without worrying about pixelating some lame .png file, and without worrying about rendering errors making the structure ambiguous.

XML-based SVG might not be the hottest technology, but the need for an open-standard, full-featured vector image format is so great, it just keeps chugging along. And its inclusion in the base technology of both the iPhone and the Google Android SDK means it will finally allow for resolution-independent and small vector files where today monstrous bitmap files are used. I'm sure Google is eager to use it in web apps for such activities as creating charts and graphs.

Now if Apple would just turn it on for the iPhone.

Thursday, September 13, 2007

Walt Mossberg's Take on Linux Desktop Deficiencies

This is an e-mail I sent to Walt Mossberg, technology columnist for The Wall Street Journal, after this morning's column on the practicality of Linux on the desktop. In the column, Mr. Mossberg skewered Linux for playing neither MP3s nor DVDs out of the box, and found this symptomatic of Ubuntu's incompatibility with the mainstream computer user.

    Mr. Mossberg,
    Just to let you know beforehand, I'm a Mac guy not a Linux evangelist, although I do maintain a Linux server in my basement.

    It seems to me that you should have mentioned the reason Linux distributions don't come with MP3 codecs or DVD players: someone would have to pay for a license for those bits of software on a per-user basis, and that gets in the way of distributing a free operating system from anonymous FTP sites. It isn't that the Ubuntu people haven't gotten around to adding that refinement; it's that it's inherently incompatible with their distribution model. The MP3 codecs/DVD decryption software you end up downloading is either a clear copyright violation or outright illegal in the U.S. (DMCA), which makes it hard to bundle with hardware.

    I hope you don't get an avalanche of e-mail saying you should use Ogg instead of MP3.


Mr. Mossberg quickly and politely replied, but since I didn't ask him for permission to repost, I will not publish his reply. As an aside, I've found him to be responsive to short, on point, polite e-mails; I don't know how he keeps up with it all.

Also, I might have been wrong: MP3 might be patent entangled, not copyright entangled. Regardless, this is quite the problem for Linux distributions. It's at least conceivable that some rich benefactor could buy an unlimited license from Fraunhofer (although Wikipedia says Fraunhofer earned 100,000,000 Euros in MP3 royalty payments in 2005), but it seems unlikely that the DVD Forum could release an open source library and still maintain the pretense of DRM. And then there's the winner of the Blu-ray and HD-DVD war, and then whatever DRM-hobbled downloadable format comes after that. Quite a problem.

It would seem that hardware vendors need a non-free version of Linux whose distribution they control, which pays its per-seat license fees and follows the DRM consortiums' rules. Perhaps audio card manufacturers could market versions which are licensed to decode MP3 streams. I don't know. I do think that Walt Mossberg is right: people have some expectation of playing standard media on their desktop computer (without breaking the law).

Wednesday, September 12, 2007

Apple's Cool New Site Search

I don't know when this was released, but the search field on Apple's site is slick. It seems every day some hotshot AJAX programmer (or WebObjects guru) comes up with some new magic once reserved for desktop applications, not web page frontends. In this case, Apple's search function does a real-time database search and populates a dynamic popup menu with images and text, all of which updates after every keystroke.

Type in "boo" and you get this:

Add a "t" and you get this:

Quick, slick and cool.

Tuesday, August 21, 2007

Recording Standard Definition TV from DirecTV with MythTV

In a follow-up to my note to myself on playing live TV from my satellite box using mplayer, this is about integrating a pcHDTV 3000's s-Video input into my MythTV system.

It would, of course, be nice if I could get all the programming I want in unencrypted HD, but sometimes all you can get or afford is old-fashioned analog. In my case, I have a D12-300 DirecTV satellite receiver on my desk; this extra receiver costs me $5 a month over the cost of the receiver in the TV room, so it's a fairly good deal. What makes it a better deal is using my MythTV to time-shift, skip commercials, and watch it anywhere in the house wirelessly on my MacBook.


Most of the hardware I already had. I did purchase the USB-to-serial adaptor—according to accounts on the web, not all adaptors work with DirecTV receivers—and the null modem adaptor.

Hardware setup

I plugged the USB end of the USB-to-serial adaptor into the DirecTV receiver, and the serial end into the null modem adaptor. I plugged the other end of the null modem into the serial port of my Dell Dimension—tighten those screws. I plugged the s-Video cable into the s-Video output of the satellite receiver and into the s-Video input of the pcHDTV card. I plugged the RCA jacks of the audio cable into the red/white outputs of the satellite receiver and the mini-jack end into the line-in port (2nd from the right) of the AV-710.

Software setup

My previous posting described setting up the system audio to activate the line-in port for audio input. Follow those instructions now to confirm you can see and hear live TV via mplayer.

As I will be using a script to change channels over the serial port, I want a non-root user account to be able to open the serial port device, so as root, from the command line:
# chmod a+rw /dev/ttyS0
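That chmod won't survive a reboot. On a udev-based distribution you could make it permanent with a rule; a sketch (the rules filename is my invention, and 0666 mirrors the a+rw above):

```
# /etc/udev/rules.d/51-directv-serial.rules  (filename hypothetical)
KERNEL=="ttyS0", MODE="0666"
```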

I would test this by running your copy of the script. For example:
$ /usr/share/mythdora/ 231
should bring up the Food Network. (Your path will be different, probably.)

I went to my account on (the soon-to-be-discontinued) Zap2it Labs and set up a listing grabber for DirecTV, limiting it to the 10 or so channels I would conceivably watch.

Then I ran mythtv-setup.
  • Capture Cards
    • Card Type: Analog V4L capture card
    • Video device: /dev/video
    • VBI device: /dev/vbi
    • Audio device: /dev/dsp
    • Audio sampling rate limit: 48000
    • Do not adjust volume: Checked
    • Default input: S-Video
  • Video Sources
    • Video source name: DirecTV
    • Rest of the settings depend on your having set up a listing grabber for DirecTV
  • Input connections - [V4L :/dev/video] (S-Video) -> DirecTV
    • Display Name (optional): DirecTV
    • Video source: DirecTV
    • External channel change command: /usr/share/mythdora/ (your path will be different)
    • Preset tuner to channel: 231
    • Click on the Fetch channels from listing source
    • Starting channel: 231
    • Input priority: -1

Of course, I kept my pcHDTV's DVB setup to record over-the-air digital HD programming, although I did change the Recording Options for the capture card to only "Open DVB card on demand".
[Update: somewhere along the way, I lost ATSC tuning for the pcHDTV card, and I'm having a hard time getting it back. For me, this is not an urgent matter as I have the two tuners in my HDHomerun, but obviously it would be important to anybody with only a pcHDTV. I'll try to figure this out, but I think it has to do with a conflict between my kernel version and the DVB drivers.] [Update 2: I updated mythtv to the current build to support Schedules Direct and it works again. I apparently can now record off of one of the pcHDTV's outputs at a time (not both at once.)]

At this point, you should run mythfilldatabase.

I then restarted the mythbackend, and opened up the mythfrontend, but before watching any TV, I had to setup my recording profiles.
  • Utilities/Setup
    • Setup
      • TV Settings
        • Recording Profiles
          • Software Encoders (v4l based)
            • Default
              • Enable auto-transcode after recording: Checked
              • Width: 720
              • Height: 480
              • Codec: MPEG-4
              • Codec: MP3
              • Sampling rate: 48000
            • Live TV
              • Enable auto-transcode after recording: Checked
              • Width: 720
              • Height: 480
              • Codec: RTjpeg
              • Codec: Uncompressed
              • Sampling rate: 48000


After finishing all this, I was very surprised that I could watch live TV both at my MythTV box itself and over wireless with my MacBook, and I could also schedule recordings and it would all work. I like having a wired connection to the satellite box for changing channels; it should be 100% reliable, unlike an infrared blaster arrangement. The video is amazingly ugly compared to the beautiful HD I get from my antenna, but then again, it's going to be a while before Good Eats comes in over the air.

I keep my audio sampling, storage, and playback rates at a constant 48000 Hz, as that is the standard playback rate for the PCM stereo audio which will eventually go out the optical port to my audio receiver, and I don't see the point of adding another possible failure point with a sample-rate conversion. My previous MythTV install made 44.1 kHz recordings sound high-pitched.

I'm just glad I'm done with this, and I hope this saves you some time.

Friday, August 17, 2007

Playing Live TV from Satellite box on Linux

This is more of a note to myself, as I'm sure this involves an unusual set of hardware and software.

I have a standard definition DirecTV satellite box on my desk. I wanted to display its content on my Mythdora Linux box—actually, I want to record standard definition programming with MythTV, but I haven't quite gotten there yet. [Update: This post is about recording]. I have a pcHDTV 3000 capture card in my Linux computer with an s-Video input, and a Chaintech AV-710 audio card with a line-level input (it's the second one from the right). I figured I could bring the video in through the s-Video port and the audio in through the line-level port via an s-Video cable and an RCA-to-mini-jack cable; it would just take a little configuration tweaking.

First, I had to use alsamixer to turn on capturing: I brought up the capture display for the AV-710 with alsamixer -V Capture, selected the Line capture item, hit the space bar, and set the Capture level to a moderate value. (Actually, I did a ton of stuff trying to make this work, but I think it boiled down to this.)

Then I had to figure out the voodoo to cause mplayer to display the video from the pcHDTV card combined with the audio from the AV-710 card. This ended up being:

mplayer -tv driver=v4l2:device=/dev/video:input=2:outfmt=rgb24:alsa:amode=2:audiorate=48000:forceaudio:immediatemode=0 -vo xv tv:// -vf pp=lb -ao oss:/dev/adsp

Breaking this command line into parts:
  • -tv driver=v4l2:device=/dev/video:input=2 means the Video For Linux framework will grab video from input 2 of the pcHDTV card
  • alsa:amode=2:audiorate=48000:forceaudio:immediatemode=0 means audio will come from the ALSA frameworks default capture device at a sampling rate of 48000 Hz (which is the frequency that my TOSLink output needs)
  • -vo xv tv:// sets the video output driver to xv, with tv:// selecting the TV input as the media source
  • -vf pp=lb means to use a linear blend deinterlace algorithm
  • -ao oss:/dev/adsp means to output it to what happens to be the optical output of the AV-710

Sunday, August 12, 2007

Mac MythTV Frontend Settings

There are a lot of MythTV settings. Page after page of preferences. In many ways, this is bad design; a refusal by the authors to make decisions about correct and incorrect behavior. But that's the situation, and here are a few settings where I think the value should change.

Utilities/Setup.Setup.General.Audio.Enable AC3 to SPDIF passthrough
If you have your Mac hooked up to a receiver via an optical cable, remember to check this or you will throw away your surround sound. Unfortunately, it doesn't auto-detect the absence of such a cable and fall back to non-passthrough, so you'll end up unchecking this when you move your Mac.
If you are using a widescreen Mac, or attaching your Mac to an HDTV, you should try MythCenter-wide as the theme. Non-wide screen themes tend to get their lower buttons cut off in the more complicated preference panels.
Utilities/Setup.Setup.Appearance.Theme.QT Style
The Mac themed buttons are unusable (transparent blue on blue). Use Windows style.
Utilities/Setup.Setup.TV.Playback.General Playback.Deinterlace playback
Unless you have a Mac which can just barely display 1080i content, or your monitor will deinterlace content for you, you should turn this setting on.
Utilities/Setup.Setup.TV.Playback.General Playback.Algorithm
This is something you should investigate for yourself. Run the Activity Monitor application in your /Applications/Utilities folder to determine the performance hit of the various deinterlace methods, and test them for effectiveness. In particular, the BOB method doesn't work well for me, while the Kernel method works but uses more of my CPU. I will try using the One field method.
Utilities/Setup.Setup.TV.Playback.On Screen Display.OSD Theme
The Gray-OSD theme seems the most tasteful of the bunch.
Utilities/Setup.Setup.TV.Playback.On Screen Display.Always use Browse mode when changing channels
If your channels take as long to change as mine, you'll prefer to see what's on before committing to changing the channel.
Utilities/Setup.Setup.TV.Playback.Mac OS X Video Settings 2/2.Video in the dock
I think this feature is pretty useless and a waste of power.

Sunday, August 05, 2007

Why Can I Use MythTV with 802.11g, but Not EyeTV?

The Linux box in the basement runs the MythTV backend. I can watch live HDTV wirelessly using 802.11g networking through the Linksys router in my back bedroom to my MacBook. As long as the microwave isn't on, and I stay within 30 feet or so of the router, the picture and sound are typically great.

I have an HDHomeRun networked HDTV tuner right next to the Linksys router, and I cannot watch 5 seconds of TV wirelessly without a skip, big pixelation, or pop using the EyeTV application. Over wired ethernet it's fine, but 802.11g? Forget about it.

I can set the MythTV to grab a stream from the HDHomerun, bring it down to the MythTV via wired ethernet, and then back up the same wire to the router, and out as radio waves, and it still works smoothly. I just watched 20 minutes of 1080i HD golf on CBS without a skip; although twisting the MacBook around will cause the occasional skip at this distance—perhaps 25 feet through a floor and a couple walls.

Why can MythTV do what EyeTV can't in this situation? I don't know; presumably it has more flexibility in buffering at the source, but it still has to average the same bit rate. MythTV might be better at handling lost packets... Again, I don't know. I just know MythTV works better for wireless HDTV.
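My buffering hunch can at least be made concrete with a toy model (this is pure speculation about why MythTV fares better, not MythTV's actual code): playback stutters only when the player asks for data and the buffer is dry, so a backend that builds a deep buffer at the source can ride out short radio dropouts.

```cpp
#include <cstddef>
#include <queue>

// Toy stream buffer: the network side pushes packets as they arrive;
// the player pops one per tick and "stutters" if the buffer is empty.
class StreamBuffer {
public:
    void push(int packet) { packets_.push(packet); }

    // Returns true if a packet was played, false on a stutter.
    bool playTick() {
        if (packets_.empty()) return false;
        packets_.pop();
        return true;
    }

    std::size_t depth() const { return packets_.size(); }

private:
    std::queue<int> packets_;
};
```

With a deep enough preroll at the source, a burst of lost airtime gets refilled before the player ever runs dry.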

Thursday, August 02, 2007

New Windows Remote Desktop Beta

The Mac Business Unit at Microsoft released the first beta of Remote Desktop Connection 2. This is good news for me. A perk of my job is that I get to do most of my C++ coding on my Mac, but I do need a PC on my desk running Visual Studio to implement and debug Windows-specific objects. As I require cutting and pasting between the two environments, I use Remote Desktop Connection on my iMac to put a full-screen PC on a secondary monitor. I can copy and paste text between the two environments using the same trackball and keyboard, and I can run full compile-link cycles without taking away from my Mac's performance, unlike a Parallels virtual machine.

Unfortunately, I was irked by inexplicable slowdowns and unreliable text pasting; and I had no real hope of the situation brightening, as I believed Microsoft had abandoned Remote Desktop Connection development and I would be stuck forever with a buggy, PowerPC-only hack. And yet, here out of Redmond comes a beta of a universal binary version 2. Read the blog entry announcing the release. They took use cases like mine into account: I like the easy sharing of a Mac folder as a Windows volume, text pasting seems instantaneous, and GUI performance is fairly snappy on my 20" Intel iMac.

It shows its beta-ness. I crashed it several times trying to bring it up on my second monitor before it stuck. I hope the MacBU will fix the more blatant bugs in short order. Thanks MS.

Wednesday, July 25, 2007

HDHomeRun and EyeTV

[Update: check out Signal GH, an iPhone utility for monitoring the signal quality of an HDHomeRun.]
Cruising the AVS Forum Mac Home Theatre forum, I was gratified to see El Gato and Silicon Dust had come together and done the obvious: bundle El Gato's EyeTV, the best DVR software on the platform, with Silicon Dust's networked HDHomeRun high-definition digital tuner.

This only makes sense, as El Gato no longer makes the closest approximation to the HDHomeRun, the long-lamented EyeTV 500 FireWire tuner with its support for digital over-the-air broadcasts and unencrypted cable streams. The HDHomeRun performs both functions, only better:
  • Two independent tuners
  • Shareable between computers
  • Placeable closer to roof antennas
  • Uses ubiquitous Ethernet port instead of specialized Firewire

If I were putting together a home theatre PC, the HDHomeRun is the only ATSC/QAM tuner I would consider; it's just so much more flexible than something tied to a single computer.

Plus, using EyeTV with my HDHomeRun cost nothing extra. I already owned a copy of EyeTV 2, and El Gato tech support gave me a link to the 2.4.2 update for free. New purchasers of an HDHomeRun can get an EyeTV bundle for $200, meaning El Gato is charging $30 for a two-seat license above the $170 price of the HDHomeRun alone.

The first problem was figuring out that I had to run the Setup Assistant from the Help menu so EyeTV could find the HDHomeRun, which it quickly did; it immediately came up streaming two digital channels in separate windows. Then I had it search for available channels, which took over an hour—812 frequencies at 5 seconds each—using the exhaustive option. It found 15 digital sub-channels in my area, which is a bit low; but it is the height of summer, and tree leaves down the street block my path to the Boston antenna farm. A later quick search found only 12 channels, so the exhaustive search may be worth it. It would be nice if it could just take the information from my TitanTV account, and maybe El Gato could speed up the scan by using both tuners. A second exhaustive scan the next morning found 19 sub-channels.
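For the record, the exhaustive scan time works out as observed; splitting the scan across both tuners (my speculation, not a shipping EyeTV feature) would halve it:

```cpp
#include <cassert>

// Exhaustive scan: frequencies times dwell time, divided across tuners.
// 812 frequencies at 5 seconds each is 4060 seconds, about 68 minutes,
// on one tuner; roughly 34 minutes if both tuners could share the work.
int scanSeconds(int frequencies, int dwellSeconds, int tuners) {
    return frequencies * dwellSeconds / tuners;
}
```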

At this point, I could watch TV; in fact, I could watch two separate TV streams at once on my MacBook, with sound from the frontmost stream. Not that I would make a habit of doing so; the combined effort of decoding two 1080i streams into half-sized windows takes 175% of a core, leaving a measly 25% to do anything else. Also, I was reminded of EyeTV's annoying habit of resizing the window every time a standard-definition commercial comes on. Remember to fix the ratio at 16:9.

As I can watch HD from my MythTV over my home's 802.11g network, I was hoping to watch live TV directly from the HDHomeRun over wireless, but that brought sputtering, stopping, and ugliness; El Gato should improve EyeTV's behavior over an unreliable network. Still, I have the whole house wired with Cat-6 Ethernet cable, so I have some flexibility.

EyeTV makes updating the HDHomeRun's firmware quite easy: much easier than manually downloading the firmware and flashing the device with the HDHomeRun's Windows utility. This is typical of EyeTV's nearly painless experience. EyeTV is the best I've seen at live TV viewing, and I've tried MythTV, VLC, and SageTV (on the PC). I haven't tried its DVR functionality, because I actually use my MacBook and can't devote it to the task; if and when I get a Mac mini for the TV room, I will give it a try. I just wish there were something to watch in the summer; I've 3 tuners in the house and nothing to see.

Friday, July 20, 2007

Upgrading to Mythdora from Fedora Core 4

My Linux installation was not aging gracefully. I had installed Fedora Core 4 a couple of years ago and had intermittently added packages as needed to keep current compiled versions of MythTV running, if just barely. For instance, the DVD player didn't know what to do with DTS tracks, and there was no overlay interface on playback (although this might have been because I had a bad OSD theme selected). I wanted to try the new tickless kernel to see if it helped the machine use less energy and make less noise, but nobody is keeping the yum database up to date for Core 4, and a yum update did nothing. So it was time to upgrade.

The Myth(TV)ology page recommends installing the Mythdora specialized distribution, which is Fedora Core 6 with MythTV pre-installed, all on a DVD image. I also wanted to take the opportunity to bump up my boot disk's capacity a bit without destroying my initial installation, so I ordered a 400 GB drive from Fry's.

Before I did anything, I backed up my MythTV database via:
$ mysqldump -u mythtv -pmythtv mythconverg -c > mythtv_backup.sql

I have a fairly uncommon configuration in that my master PATA drive (hda) is a Windows XP boot volume, while the slave (hdb) is my Linux disk. I replaced the hdb drive with the new 400 GB drive and booted off the Mythdora DVD. I had to be very careful not to wipe out my XP installation: I chose to install on the hdb drive, and used the installer's advanced options to put the boot loader on hdb. Experience told me that installing a boot loader on the XP disk would be bad. Then, after installing Mythdora, I booted off a Knoppix LiveCD and, in the terminal, made a copy of the first 512 bytes of the hdb1 volume:
$ dd if=/dev/hdb1 of=mydora_bs.bin bs=512 count=1
I mailed the resulting file to myself via a webmail account. I booted into Windows XP and followed the instructions for editing the boot.ini file to make my Linux installation an option for the Windows boot manager.

At this point, I could boot into Linux. I put my old Linux boot hard drive into a FireWire case and attached it to the computer, allowing me to recover the old database via:
$ mysql -u mythtv -pmythtv mythconverg < mythtv_backup.sql
and to copy the contents of my recordings folder, my .mythtv folder, and anything else onto the new disk. My old recordings had been in a /video volume, but Mythdora had created a /storage volume, so I created a symbolic link pointing /video into the new partition.

After doing this, the overlay display disappeared, which led me to discover that I had been using an invalid OSD theme (who names these things?); the overlay came back after I set a valid theme. It took me a while to find the proper audio device settings to allow TOSLink pass-through on my Chaintech AV-710 audio card (it's /dev/adsp), and of course there is always the nonsense of dealing with my complicated xorg.conf file.

And now I've upgraded, and everything seems to be working OK. I can play DVDs with DTS tracks. I have my overlay display. I'm getting used to Gnome instead of KDE.

Oh, and the energy use has not improved. If anything, it's using a few more Watts. Still worth the upgrade.

Wednesday, July 04, 2007

How To: Fix Portrait Oriented Movies

If you are like me, you have dozens of video clips from digital "still" cameras which are rotated 90°. And like me, you find them hard to watch in their original form. Luckily, this is easy to fix with QuickTime Pro.

Open a .mov or .avi file from your camera using the QuickTime Player which you have upgraded to Pro:

Under the Window menu is a Show Movie Properties item. A dialog pops up, and you should see a list of tracks. Choose the video track, then click on the Visual Settings tab.

Quicktime Visual Settings
Click on one of the circular arrows. Your movie will rotate 90° and you can now save it in a form suitable for friends and family.

I don't know why iPhoto doesn't do this. It should.

Monday, June 25, 2007

Widescreen Still Cameras

For Christmas, I bought my wife a Panasonic DMC-LX2K digital still camera. This camera has many interesting features—including a Leica lens—but one pushed me towards the purchase: its native widescreen sensor. Unlike most cameras with a 16:9 setting, this camera does not throw away pixels when you shoot in widescreen. Also, the camera records QuickTime movies at greater than DVD resolution (848x480). In good light you would not believe the quality of video you get out of this "still" camera.

In bad light, on the other hand, you would not believe how quickly the quality drops to atrocious. I tell everyone that this is the world's greatest pocket camera, except for the lousy low-light pictures.

But this entry is not about any given camera, but about the benefits of a wider aspect ratio in digital photography and video. If we watch our media content on 16:9 HDTV screens, and the 16:10 aspect ratio is becoming commonplace for computer monitors, why does 4:3 have a near monopoly on still photography? And will this change?

This topic was brought to mind as I was using Front Row on my MacBook to present a series of recent snapshots on my 720P HDTV. All the landscape shots fit perfectly on the screen, showing the content at its best and retaining the original composition of the images.

Photos with different aspect ratios, especially the extreme 9:16 of a rotated shot, came up with some very odd, pixelated cropping.

Or this picture of my mother at the beach in the more common 3:4 portrait image:

But even standard 4:3 images can come up missing important details, like me peering over the top of this menu:

It is an implementation detail of iPhoto/Front Row that full-screen images are cropped to fit (this is for straight viewing of the library, not slideshows, where you have more control). Apple could just as well have pillarboxed the images, and probably should have, even though that shrinks the picture. This does not negate the fact that the best aspect ratio for photos which will be displayed on an HDTV is 16:9, and 16:9 is a good compromise for full-screen display on the 16:10 aspect ratio of newer Apple monitors. It's certainly better than 4:3. And 16:9 is better for the same reason it and wider formats are used in movies: it's easier to tell stories with pictures if you have room to work with, without jamming your subjects together in unnatural intimacy. People occupy space.

Why is 4:3 still the standard for digital stills if it is no longer the standard for digital video? I had thought the reason was the dominance of 4:3-friendly printing media (you can't go to Staples and get widescreen photo paper), but the most common photo size is 6:4, which has an aspect ratio of 1.5, roughly halfway between 16:9's 1.78 and 4:3's 1.33. Anyway, most digital photos are only seen onscreen, and screens are getting wider; all of Apple's monitors and laptops use 16:10 displays. The iPhone doesn't follow this trend with its 3:2 aspect ratio, probably a compromise between esthetics and comfortably fitting in the hand.
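For reference, the aspect ratios under discussion as decimals (note that 16:9 is about 1.78, not a round 1.8):

```cpp
#include <cassert>
#include <cmath>

// Aspect ratio as a width/height decimal: 16:9 ~= 1.78, 4:3 ~= 1.33,
// and both 6:4 photo paper and the iPhone's 3:2 screen are exactly 1.5.
double aspect(int w, int h) { return static_cast<double>(w) / h; }
```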

I suppose manufacturers feel the buying public isn't ready for the change. They've shot 4:3 since the days of glass negative plates and probably don't feel the need to change. Also, I would think (but don't know) that the optics have to be more sophisticated to project such an oblong image without chromatic and other distortions. But the public will change. They will tire of the whole shoddy experience of looking at their hi-tech photos on their hi-tech TVs and grab onto solutions.

Consumers should buy 16:9 equipment when available and compose and crop their shots in future-proof 16:9. Imagine 20 years from now how quaint all those 4:3 pictures you took last month will look. Anybody will be able to tell at a glance that they were from a bygone era—of tail fins, rotary phones, and poodle skirts.

Thursday, June 21, 2007

Alton Brown Uses My Software

My TiVo has a Good Eats season pass, and you can bet there is one thing that will get me to pause, rewind, slo-mo, and call the wife in from the other room: when he brings out his LX-Proscope and starts looking at the fine details of yeast, sugar, or, most recently, pretzel salt. Why? Because I wrote the Luxus software clearly displayed on his PowerBook. It was my first project using the Qt framework, and it turned out pretty well for what it was; I just wish they had let me make a Cocoa version. Somebody told me they've also seen it being used on CSI. How cool is that?

I bring this up because I was having a hard time finding Mr. Brown's e-mail address to correct his description of what constitutes an acidic solution. His chemistry knowledge is usually good, but not this week.

Wednesday, June 20, 2007

Which Mac OS X Should I Support?

A critical decision any developer of Mac OS X software has to make is where to draw the line between the users we will support and the users we will tell to come back when they have a newer operating system. It comes down to the number of users and the effort to support them. First, the numbers: according to Steve Jobs's recent WWDC keynote, there are 15 million Tiger users, 5 million Panther users, and 2 million users of older operating systems.

This is for the overall Mac community; the market segment you intend to develop for may be different. Maybe you are targeting the elementary school market, where 5-year purchasing cycles create huge pockets of happy 10.2.8 users (I guess). Or maybe you are targeting the top-end photo editing business, where the release of universal binary Photoshop has unleashed a maelstrom of hardware and software purchasing the like of which has not been seen since the world was young (another guess). Temper these absolute numbers with the knowledge that somebody who hasn't shelled out money in 6 years to buy a new OS is unlikely to buy third-party apps. Regardless, these are the numbers we have, and what we'll use.

Apple itself is always happy to tell its developers to forget about users of older releases (not in so many words). If these users find they cannot run some new application, they will gradually feel the pressure to upgrade, hopefully to new hardware; but even selling a $129 OS upgrade to an iBook G4 user is nearly pure profit for Apple. Also, Apple wants us to integrate new OS features, which is hard to do while still holding onto 10.2.8 support.

There are over a million people out there running 10.2.8, some small number running 10.1, and some unlucky few running 10.0 who couldn't be bothered to get the free 10.1 upgrade. Should developers support them? No.

  • You want to use invasive technologies like Cocoa Bindings, which are 10.3 and above.
  • You won't be able to debug and still use the latest developer tools.
  • You are going to find that these people are 5% of your user base but manifest 50% of your bugs.
  • Your graphics are going to look awful, trust me.

Realistically, only a product with a near-100% installed base and a huge development team, like Acrobat Reader, would even think about supporting 10.2; and look, even Adobe's minimum OS is 10.4.3.

OK, well what about 10.3? Those 5 million users are awfully tempting. Yes, they are. And I'd still lean against supporting them in any project you are starting today. Why?
  • Debugging, while possible, is hard. Maybe you are the master of command line gdb or remote debugging, but I am not.
  • By the time you finish, October will be here, and those 10.3 users will be squished into a smaller slice by Leopard.
  • There are many 10.4 specific technologies in Cocoa and Core Foundation that I'd like to use.
  • A one or two person team only has so much time, and there are a lot of better places to put your effort.

So my advice is to target 10.4, and be thankful you never had to support Windows ME. The horror, the horror.

Friday, June 15, 2007

iTunes Plus - It Doesn't Nauseate Me

I opened up iTunes yesterday and navigated to the iTunes Plus page. I was offered the opportunity to upgrade my 21 EMI tracks to the new format for $6.30 ($0.30 apiece). I accepted the offer. I'd like all the music companies to let their music use this format, and this is how we consumers vote: with money. Not that Apple's DRM has hurt me. About once a year, my iPod forgets it's authorized to play my tracks, requiring a sync; but otherwise it's been transparent. What I like is the knowledge that my music is now truly my music, and the quality.

I know some people believe an ordinary person can't tell 128 kbps from 256 kbps AAC files, and this may be literally true when comparing a 10-second clip in one format with the same 10 seconds in the other. The difference is in the long term.

Backing up: in one of my previous incarnations, I helped write the ill-fated OS X client for MusicMatch. Ill-fated because Apple came out with iTunes, and it's hard to compete with free.

People at MusicMatch were used as convenient guinea pigs to determine just how much you could compress various styles of music before you started to hear the compression; the more compression, the more MusicMatch could save on bandwidth costs. Too much compression, and people would hate it. The goal was to deliver good sound, but not wastefully good sound. We'd put on big noise-isolating headphones and listen to several clips at various compression levels, plus an uncompressed version, straining to hear the difference. Over and over, for an hour. I became reasonably good at catching the differences, picking out the instruments especially susceptible to distortion: those that rattled, scratched, or were high-pitched went first, their fine details lost; things like bass guitars, not so much.

And sometimes I couldn't tell the difference. But one thing I can tell you about overly compressed music, even that which I couldn't distinguish from the original: it made me sick. Not throwing-up-in-the-wastebasket sick, but mildly nauseated, or at least a general feeling of non-wellness. At the time, I thought it was just from having to listen to the same Backstreet Boys riff too many times, but I have a new theory.

Music is compressible because the human mind can be fooled. The AAC and MP3 codecs use psychoacoustic tricks which make the mind think it's hearing what it isn't. The actual sound waves don't look much, if at all, like the original; the details are faux. For me, this trickery disorients and tires my brain, and the more compressed the music, the more intense the feeling. The less compression, the less trickery, and the more wholesome the music. Not that 128 kbps AAC is as bad as 96 kbps MP3; I'm just saying I can barely feel the unnaturalness of it. Similarly, I hate listening to stereo music being manipulated by the Dolby Pro Logic filter through my SphereX surround speakers; a few seconds of that and I'll be diving for the remote's 2.1 button and its refreshingly clean sound.

I'm listening to some Norah Jones in iTunes Plus format now. Her piano sounds rich and beautiful; I love the sound of a grand piano. I feel good. The music is making me feel quite mellow. I can't consciously tell that the harmonics of the acoustic guitar in my left ear are more realistic, or that the tambourine being slapped in my right ear is less distorted, but my sense of well-being can. So, anyway, I paid my $6.30.

[Update: I suppose I should point out that I'm not claiming any expertise in audio compression or the psychology of sound perception. I'm an application programmer. No sensible person would hire me to write an audio codec.]

Sunday, June 10, 2007

AAAAAARG - The Sound of a Corrupted Videoserver

So, I rebooted my MythTV into Windows Friday morning. The system asked me if I wanted to set up the drivers for the new Rosewill SATA card I was using to access the 500GB video server drive. This seemed reasonable enough.

Bad move.

Somehow, Windows managed to severely damage the ext3 partition on the drive over the course of setting up the driver. Lost everything that was on it, much of it with no backup.

So I reformatted, and rebooted into Windows to see if it would be gentler now that the drivers were set up. It was. Then I started over again.

While I was at it, I installed the ext2fs modules for Windows. For the 30 seconds I tested it, it seemed to work fine, showing my Linux partitions on the Windows desktop.

Thursday, June 07, 2007

Port Model Train Software to OS X - Under the Hood

This is the second of two posts on my experience helping to port TrainPlayer to Mac OS X. It covers my theories and designs for using OS X application technologies in a project which shares code with a Windows C++ application. This is a top-level view rather than a code-level look at such things. Also, while the framework was designed from the start to be cross-platform, sharing code between Mac and PC, TrainPlayer for the PC still uses its old codebase and may never make use of the techniques described below.

Why Is Cross-Platform Development Hard?

It may be a myth that the Russians chose a different railroad gauge to prevent invasion, but it is not a myth that commercial OS companies avoid open-standard development APIs to keep their developers their developers. Neither Microsoft nor Apple cares much for Java, with Microsoft going so far as to invent a language, C#, which looks like someone took a thesaurus to Java. The careful software engineer can still write the bulk of an application in vanilla C++, with wrappers to the native GUI and application frameworks. But you have to be very careful.

Rejected Paths

You can write C++ applications with the Qt framework and have them run on Windows, Mac OS X, and Linux. I've written three such applications, and I use Qt applications such as MythFrontend and a Perforce client every day. My problem with Qt is that I don't believe it is possible to write a first-class OS X application in Qt. You can make a serviceable Windows application in Qt, but then again, Windows users are not known for their esthetic sense. Also, there are licensing fees for commercial development, and Qt does a lot of weird things with C++ to allow the visual design of interfaces, which as a side effect locks you into Qt.

We could have written the whole application in Cocoa with no reuse of the Windows code, sharing just the document schema. But that's an awful lot of wisdom to throw away.

The Path We Took

What is more valuable, the code you write, or the code you get for free with a framework? The framework code, because you won't have to throw money and time at maintaining it. Do not replicate what you are given for free on a platform for the sake of sharing cross-platform code, even if it means replicating some bit of functionality on the other platforms, or, horrors, having the Mac version behave differently than the PC version. I can't tell you how many times I've heard the same excuse about how the manual will be confusing if the versions aren't exactly the same. Get used to it: Mac applications are similar to PC applications, but they are different, and Mac users demand their differences. And few read the manual.

For example, most applications have simple menu needs. You could set up all the menu items in TrainPlayer in Apple's Interface Builder in an hour, and add a new menu item in 4 minutes. It is not worth anybody's time putting together some elaborate framework of XML description files so that you don't have to edit the menus in both Interface Builder and Visual Studio. Yes, occasionally you will forget to add a menu item on the other platform; better that than maintaining 500+ lines of code for cross-platform menus. Such code would be especially problematic in Cocoa, where many menu items are loosely wired to the first responder instead of a document or application class. On the other hand, if your menu needs are complex--perhaps you have a dozen different versions of your application, all with different menu subsets--then it's time to write yourself an XML schema and go to town. Thankfully, such was not the case here.

Of course, Cocoa makes it so easy to do just about anything in the GUI that every time this comes up you end up whipping up a little Objective-C class to provide data to the car collections dialog or the pre-defined layouts browser. If any of these did anything truly complicated, the prescribed action would be to create a cross-platform model class; but anytime the Objective-C-to-C++ glue code is nearly the same size as the actual C++ code, it's time to just keep that part platform-specific.

Which brings me to what, if anything, is cross-platform here? Anything that involves manipulation of the document class and its data structures. We created a subclass of NSDocument and had it host an instance of the C++ object which contains the actual document data structures. The NSDocument subclass does things like passing in menu commands and connecting the cross-platform code to the main view in the document window.
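In outline, the arrangement looks something like this sketch; the class and method names are invented for illustration, not TrainPlayer's actual API. The thin per-platform wrapper (an NSDocument subclass on the Mac, an MFC document on the PC) owns the C++ core and forwards commands into it:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Cross-platform document core: owns the real data structures. The
// per-platform wrapper holds one of these and forwards the commands
// it does not handle itself.
class LayoutDocument {
public:
    void addCar(const std::string& name) { cars_.push_back(name); }
    std::size_t carCount() const { return cars_.size(); }

    // Returns true if the shared code consumed the command.
    bool handleCommand(const std::string& command) {
        if (command == "removeAllCars") {
            cars_.clear();
            return true;
        }
        return false;  // unrecognized: let the platform layer try it
    }

private:
    std::vector<std::string> cars_;
};
```

The wrapper stays thin: file I/O hooks, menu plumbing, and view wiring on each platform, with all document logic living once in the shared class.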

Cross-platform GUI

Any view which is composed of non-standard content (i.e., isn't composed of buttons, text, sliders, etc., but is something special to this application) is created by adding an NSView subclass we called a LayeredView and giving it a C++-based DrawingSource. This includes the main document view, the clock, the control panel, the side view of trains in the control panel and in the toolbar, and the switch-editing panel. All these views, and only one NSView class. All this rendered content, and the logic for it is entirely cross-platform. On the PC, I've written a corresponding MFC class which provides the same function as a proof of concept.

The LayeredView object for each platform provides:
  • Hosting for an arbitrary number of layers
  • Capturing mouse and keyboard events and passing them to the DrawingSource
  • Drawing each layer when requested by the source
  • Popping up contextual menus
  • Zoom-level support

A DrawingSource object provides:
  • A list of drawing primitives (more later) for each layer
  • An affine transformation for each layer
  • Mouse tracking
  • Key press handling
  • Determination of which contextual menu to show
  • Idle handling

Because the LayeredView on the Mac is an Objective-C object, the DrawingSource does not talk directly to the LayeredView; it forwards requests for services such as layer invalidation, size changes, and screen-coordinate conversions through a C++ DrawingDelegate object, which the LayeredView installs in the DrawingSource when they are initialized together. In a mixed-language setup, you will have to create these small interface objects anytime cross-platform C++ has to interface with some other language.
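A sketch of those two classes as described, with the method set invented for illustration: the Objective-C LayeredView installs a concrete DrawingDelegate subclass at setup, so the cross-platform DrawingSource never touches Objective-C directly.

```cpp
// Services the platform view offers the cross-platform source.
// The concrete subclass lives on the platform side (Objective-C++
// on the Mac, MFC on the PC).
class DrawingDelegate {
public:
    virtual ~DrawingDelegate() = default;
    virtual void invalidateLayer(int layer) = 0;              // request a redraw
    virtual void contentSizeChanged(double w, double h) = 0;  // resize notice
};

// Cross-platform side: never sees the view, only the delegate.
class DrawingSource {
public:
    void setDelegate(DrawingDelegate* delegate) { delegate_ = delegate; }

    // When the model changes, ask the platform view to redraw that layer.
    void modelChanged(int layer) {
        if (delegate_) delegate_->invalidateLayer(layer);
    }

private:
    DrawingDelegate* delegate_ = nullptr;  // owned by the platform view
};
```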

Drawing Lists

The DrawingSource provides a DrawingList, which is just an STL vector of primitive graphic operations on primitive graphic structures. For example, adding a point to the current path is an operation, stroking the current path is an operation, drawing a bitmap at a given point with a given rotation inside a given box is an operation, etc. Lists have a number of advantages over the alternative of providing a wrapper API--a whole slew of drawing methods like FrameRect(), FillRect(), FrameOval(), FillOval(), FrameBezier(), FillBezier(), DrawText(), ClipPath()...--as graphic toolkits often don't map well onto other toolkits. The PC TrainPlayer, for instance, makes heavy use of GDI Brush objects, to which a Cocoa programmer might say "What's that?" and munge together some state object which approximates a Brush. With a DrawingList you are just creating a list of platform-neutral instructions for drawing something; you don't have to worry about passing around a platform-appropriate drawing context, and you don't have to worry about flushing too often or not often enough. There is a single routine per platform which knows all about drawing there and is optimized for rapid drawing: on the Mac it uses the Quartz API, on the PC GDI+. You could easily imagine alternative renderers based upon OpenGL or any other modern toolkit.
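A minimal sketch of the idea, with the op set and names invented for illustration: the list is plain data, and a single per-platform routine walks it.

```cpp
#include <vector>

// One drawing primitive. A real op set would include bitmaps, text,
// clipping, fills, and so on; three ops are enough to show the shape.
struct DrawOp {
    enum Kind { MoveTo, LineTo, StrokePath } kind;
    double x, y;  // meaningful for MoveTo/LineTo only
};

using DrawingList = std::vector<DrawOp>;

// Stand-in for the per-platform renderer (CGContext calls on the Mac,
// Gdiplus::Graphics on the PC). This stub just counts path points.
int renderCountingPoints(const DrawingList& list) {
    int points = 0;
    for (const DrawOp& op : list) {
        if (op.kind == DrawOp::MoveTo || op.kind == DrawOp::LineTo) ++points;
    }
    return points;
}
```

The cross-platform code only ever builds DrawingLists; no drawing context, no flushing, no toolkit types leak across the boundary.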

DrawingLists are conducive to layered drawing: just keep a separate list for each layer. If nothing changes in that layer--maybe it's the layer of station switches in TrainPlayer and the current update involves the cars moving--then there is no need to recalculate it; just redraw it.

Another nice thing about DrawingLists, which I didn't take advantage of here because of all the legacy code, is that they can be generated on a separate thread. On multi-core machines (i.e., all new machines), this can be a big win, especially if determining what to draw is computationally intensive. And each layer could have its own thread. Graphic toolkits tend to require that drawing be done in the main thread, making any preparatory step which can be done in child threads helpful.
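Only the generation of a layer's list moves off the main thread; the main thread collects the finished list and then does the toolkit drawing. A sketch with std::async (the build function is a stand-in for an expensive what-to-draw pass):

```cpp
#include <future>
#include <vector>

struct Point { double x, y; };
using LayerList = std::vector<Point>;

// Stand-in for an expensive "figure out what to draw" pass for one layer.
LayerList buildLayerList(int count) {
    LayerList list;
    for (int i = 0; i < count; ++i) list.push_back({double(i), double(i)});
    return list;
}

// Generate on a worker thread; the caller (the main thread) collects the
// finished list and only then touches the graphics toolkit.
LayerList buildLayerListOffMainThread(int count) {
    std::future<LayerList> pending =
        std::async(std::launch::async, buildLayerList, count);
    return pending.get();
}
```

With one future per layer, several layers could be prepared in parallel while the main thread keeps the UI responsive.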

Data Types

I prefer using cross-platform data structures based upon STL or Boost, but sometimes this is not practical; necessity and performance sometimes force the use of platform-specific structures. You don't want to spend your rendering time converting raw bitmaps into platform-native images. Therefore, I defined a set of typedefs and macros:
#if _MAC
typedef CGImageRef ImageBasis;
typedef CFStringRef StringBasis;
typedef std::string FileNameBasis; // treat as UTF-8
typedef CFURLRef FilePathBasis;
typedef CGAffineTransform AffineTransformBasis;

#define STRING_RETAIN(nativeString) if(nativeString != 0) ::CFRetain(nativeString);
#define STRING_RELEASE(nativeString) if(nativeString != 0) ::CFRelease(nativeString);
#define STRING_LENGTH(nativeString) ((nativeString == 0) ? 0 : ::CFStringGetLength(nativeString))
#else // not _MAC
typedef std::wstring FileNameBasis;
typedef std::wstring FilePathBasis;
typedef boost::shared_ptr<Gdiplus::Bitmap> ImageBasis;
typedef Gdiplus::Matrix AffineTransformBasis;
typedef std::string StringBasis;
#endif // _MAC
In general, the cross-platform code gets passed these structures and passes them through unchanged, except for memory-management chores, to the platform-specific code.

And by the way, whatever string class you use, make sure any string the user sees is created and stored as Unicode; it's 2007, people!

Mac Specific Code

Rendering was done with the Quartz API, with an assist from ATSUI for text rendering. Layers were handled via CGLayerRef (which TrainPlayer later simulated to get OS X 10.3 support). Utility windows, dialogs, and the inspector floater were straight Cocoa, using Cocoa Bindings to interface with my NSDocument derivative. This was my first major use of Cocoa Bindings, and I was extremely disappointed in its stability; the biggest problem I had in this development was trying to work around odd crashes deep in the bindings machinery.

PC Specific Code

As I said, I only created a proof-of-concept drawing list renderer for the PC, using the GDI+ toolkit embedded in an MFC application. GDI+ can do much of what Quartz can, although I missed the ability to cast shadows from arbitrary objects, and I had to do extra work to get layers.


I'm thankful for the opportunity to try out my theories on cross-platform development on a live target. I'm most pleased to have come up with a methodology with the potential to share large amounts of code between platforms while still allowing me to create a first-class OS X application. I'm especially fond of the drawing list design, and find it a good way to factor drawing calls even when cross-platform development isn't a concern.

If I had had more time to devote to the project, I'd have worked at integrating technologies like the platform specific undo managers into the design, but with the birth of my daughter I have zero excess development time. Things could always be better, but I gave TrainPlayer a good start.

Monday, June 04, 2007

The Golden Age of OS X Independent Software

I've found myself buying a lot of software for my MacBook these days. It isn't because I'm swimming in money, because I most certainly am not. It's because I have software needs and wants, and my peers are out there satisfying them. Here is what I've paid my own money for in the last few months:

  • Tables is a very satisfying spreadsheet which is nothing but a spreadsheet. It is very Mac-like: understated, stable, and it does what I expect of it. I chose it over Mesa, even though Mesa had charts, because of Tables' obvious emphasis on detail. Now it does charts too, although they need a little work. I'd been using AppleWorks way past its expiration date.

  • YummyFTP is exactly what I was looking for in an FTP client. Believe me, I tried many of its competitors before settling on this classy little Cocoa gem. I needed a client which could deal with an incredibly unstable Chinese FTP site, and this was up to the job.

  • DVDPedia was recommended on the HT Guys podcast and I like it too. It has its share of interface issues, but a lot of craft has gone into it.

  • Remote Buddy I've mentioned before. I know from experience that input devices are cranky things, but Remote Buddy makes handling them seem effortless.

  • SuperDuper!'s free features are so good, I haven't even paid for the premium features, but I probably will. After getting the heads up on this backup software from The Maccast, I used it to transfer the contents of my hard drive before upgrading to a 200GB internal. It took a long while, but it was obviously very careful about protecting my data.

What does all this software have in common? Open each package's contents in the Finder and you will see the unmistakable traces of Cocoa development. It's a lot easier today for one or two people to write an insanely polished application, because Cocoa handles the parts every application shares, leaving us to concentrate on what makes our applications unique.

Sunday, May 27, 2007

My Experience At the Cambridge Galleria Apple Store

I took my MacBook to the Genius Bar at the Cambridge (Mass.) Galleria Apple Store--making sure to make a reservation. The MacBook had been spontaneously shutting down, not sleeping, in low battery conditions, and I wanted to see if the Genius would authorize a warranty replacement of the battery. There was some urgency as my battery cycle count (as shown in the Power section of the System Profiler) was at 288, and my understanding is they will not replace a battery with a cycle count over 299.

I arrived at the designated time--6:15pm on Friday--and my name was already 4th in the queue over the bar. I explained my situation to the genius, who looked at my battery's stats and agreed it needed replacing. He asked me if I needed anything else and I mentioned the palm rest recall; my palm rests had big brownish smudges and he agreed they should be replaced free of charge. He took the 'Book into the back room, and twenty minutes later I had a new battery and a pristine palm rest. Very satisfying.

I've had other times when a Genius wouldn't replace a battery--not his fault, the cycle-count rules are the rules--so it was nice to walk in and get satisfaction. Also, I don't know if Apple's made any money off of this MacBook: it was bought at educational pricing, had to have its heatsink replaced, and now the free battery and top cover.

Friday, May 18, 2007

When to Buy A Big Video Server Hard Drive

On February 9th of this year, a Seagate Barracuda 750GB internal hard drive cost $300 shipped according to DealMac. By May 18th, that number had dropped to $200. (A 500GB drive dropped from $144 to $110 over the same period, while a 250GB drive kept at a constant $60.)

So the price of a 750GB drive dropped by about $33 a month over that span. If I had purchased a drive back in February, I'd now be wondering whether I had gotten $100 worth of convenience out of it. I'm guessing not. On the other hand, it would seem silly to await the end stage of obsolescence the 250GB drives now occupy; my son is steadily making coasters, after all. With 1TB drives now available, there should be even more downward pressure on the 750GB. Perhaps in a month or so it will be down to $170, and the depreciation will slow to a more manageable $15 per month; or maybe it will accelerate. Nobody knows; we can only guess.

Why don't I buy two 500GB drives and get a better bang for the buck ($0.22/GB versus $0.27/GB)? Four reasons:
  • I'd have to put those drives in either 1 expensive case or 2 cheap cases, ruining the price advantage.
  • If I put them in one case, it would presumably be in a RAID-0 configuration, increasing the probability of catastrophic loss. If I put them in two cases, their use becomes just that much more complicated.
  • I'd have to provide power for the extra hardware, and at a couple dollars/month extra over the lifetime of the drive, it adds up.
  • More drives equal more heat and fan noise.

Therefore, I'd like to keep with one big drive in a Firewire case—simple, comparatively safe, and fairly energy efficient. I can backup everything I foresee needing on 750GB, and I'm going to say my trigger point is $175. I can attach it to the MythTV in the basement making it available to every computer in the house. Beyond the foreseeable, I'm sure there will be inexpensive 1.5TB drives in a couple years and I can do an upgrade then.

[UPDATE: a 500 GB drive dropped to $100 shipped over the weekend. I'm now thinking that by adding an internal SATA port on a PCI card, I could install such a drive inside my Linux desktop, avoiding the need for an external enclosure, and yet another "box with green LEDs" on my desk. I paid for the SATA power connector on my power supply, and it's about time I used it. This would be $100 cheaper, and I can supplement the storage space with unused bytes from the other two hard drives in the computer. The Windows partition, in particular, has a lot of free space.]

[UPDATE 2: I could not deny the sweet spot of the buying curve, so I bought a 500 GB Maxtor Ultra16 SATA drive ($100) along with a Rosewill PCI-to-SATA adaptor ($19). I installed it in an open drive bay in the basement MythTV box, where the card and drive together added 12W to my energy usage when in use and 2W in low power mode. The drive formatted to 459 GB (480,719,056 KB) under ext3, the file system standard to Linux, so plan accordingly.

It was a mistake to get that particular SATA card, as it only has one internal connector. I'll have to replace it when I max out my drive bays. On the other hand, it popped in and worked under Linux with absolutely no extra effort.

There is one open drive bay left, and in a few months I can buy another 500 GB at a reduced price, finishing off my storage needs, and making my basement uninhabitable with all the noise. In the meantime, I can drive DVDPedia with my Wiimote, and have my favorite content available in a few clicks. Now, if only I could afford a Mac Mini...]

Sunday, April 29, 2007

Porting Model Railroading Software to OS X From Windows

Last week, TrainPlayer International released a Mac OS X version of TrainPlayer, their application for designing model railroads. I had something to do with this: I set up the general structure of the application at the start of the porting project, implemented a number of features, and worked into the debugging stage. But my real job has been nothing but debugging for the last two months, and I have nothing left to give to side projects, so they've had to get along without me.

This blog entry is the first of two on the project. This will cover GUI differences between a Windows application and a Mac OS X application, in terms of user expectations. The second will be on designing an application under the hood to share code between a Cocoa application and an old MFC application.

What Mac Users Want

The first thing I noticed about the interface was the surfeit of toolbars, with buttons for nearly every action.

When I worked in Illinois, we called this "Marketing Mode." Compare this with the left

and the right side of Apple's Pages word processor.

Apple Pittsburgh spent untold hours categorizing the actions required to lay out text and distilled them into six icons. I can imagine the pressure they felt to add a special button here or there for someone's pet function ("What about a spell check button?"). But it is important that toolbar icons be few and large so that mouse-acquisition times stay short, and that requires thoughtful selection of buttons. Notice there is plenty of room under the large icons for actual descriptive words; sometimes a word is worth a thousand pictures.

The second thing I noted was how often the user was required to open a modal dialog box--sometimes two, or even three nested modal dialogs--to configure some user interface element. For example, to change the labels for a series of railway cars you would select a car and choose "Properties..." from the "Train" menu to bring up this modal dialog:

which you would modify, hit "OK", and repeat for every car in the series. Modal dialogs are great for programmers; the work flow is linear and controlled. If the user hits "Cancel" you just don't change your data structures, while if they hit "OK" nobody expects you to let them undo it; you've already given them a chance to back out. It's just so easy. But Mac users these days are used to more.

The more they expect is a modeless, floating inspector window.

This allows the user to select, change, and rapidly select and change again, all without losing focus on the document. It should allow for undoing changes (that functionality apparently hasn't made it into TrainPlayer yet, but it should). The user should feel more comfortable, able to work in a more free-flowing manner.

The Windows version used GDI graphics for its rendering needs, resulting in pixelated track paths like this:

I felt OS X's Quartz graphics environment would create much cleaner lines if we treated tracks as continuous paths instead of discrete segments, and the Windows programmer was willing to make the changes needed to get nice anti-aliased, beautifully joined paths like these:

Similar effects could be achieved under Windows with GDI+, but I felt the typical Mac user would immediately see what was missing and demand higher quality. Quartz also allows any object to cast a shadow, making this effect extremely easy to produce. Simple as it is, this translucent shadow makes the cars pop out, resulting in a more realistic look.

Along the same vein, Windows users see this clock:

while Mac users will see this clock with a composited reflective crystal and hands throwing shadows:

The Windows version had a modal dialog to choose individual cars from collections. It had two modes: an icon view and a table view, with each collection selectable via a tab.

I felt there was no need for two modes, as the Mac table class was capable of displaying the larger image; and I am leery of using tabs to represent categories, especially when the user can install or create their own. Therefore, I replaced the tabs with a single-column table, as you might see in an iLife application, and merged the two modes into a single table grid.

In other areas, I left it up to the experts at TrainPlayer: they know model railroading, whereas I only know generic applications. So when they say this is how you build up a set of tracks, this is how you build a set of tracks:

And this is how you run a simulation on a completed railroad:

I wish YouTube didn't add an ugly filter to its videos. You have to see how pretty it is.

What Was Achieved

In many ways, I'm proud of the result so far, and I expect the application to go through a rapid process of refinement now that it is available to a wider Mac audience. The areas of the GUI where I varied from the Windows original are definite improvements, some of which could be back-ported to give the Windows user base a taste of Mac refinement.

In other places, I'm not thrilled. After I left the project, too many buttons crept into the GUI, and text fields really should support undo. Today's standards are high; it's hard to keep up, especially while reusing large chunks of C++ code and bitmap images from a Windows application. The good thing is there is always version 1.0.1.

Porting is a balancing act. You've got to know which code and which GUI elements can be reused, and which must be re-implemented, in order to avoid the dreaded "Looks like a Windows port" review. I think we've avoided that, and that's an accomplishment in itself.

Thursday, April 26, 2007

Wii Remote + MythFrontend + Remote Buddy + OS X

In most aspects of my life, I have decent impulse control. But when I discovered IOSpirit's Remote Buddy software allowed me to use a Wii Remote Controller to control my Mac, I just had to have one. One-click buying on Amazon, right now, need to have!

I don't even own a Wii; I've never even seen a Wii; I don't know how to pronounce Wii.

I downloaded Remote Buddy, and in 30 seconds determined it was worth 10 Euros. I am impressed by the degree of finish this little product shows. As an application developer, I can tell when somebody has put their love and creative soul into a product, and Remote Buddy shows this. And IOSpirit is actively improving Remote Buddy too; just last night they released an update which made a marked improvement in the interface and, apparently, in the reliability of button pushes.

When the Wii Remote Controller arrived from Amazon, the longest step in the setup was inserting the batteries. Remote Buddy paired quickly with the remote and I was soon driving my Mac from across the room.

IOSpirit has full support for all the buttons on the remote except the power button, which I guess must control an infrared emitter instead of using Bluetooth. Nor does it have any apparent tie into the accelerometers inside the remote, or the speaker. [Update: After looking over the support forums at IOSpirit, there is support for virtual mousing with the accelerometer, but it involves providing a strong infrared beacon (a candle) to give the infrared sensor on the remote an extra frame of reference, and is therefore probably not very practical.] [Update 2: I built an emitter based on 3 infrared LEDs, using the circuit generated by this online calculator. This is a lot more convenient than mousing via cursor keys, but is a bit wobbly, especially when double clicking.] The way it controls Front Row and iTunes is extremely slick, with lots of use of the Heads-Up Display (HUD) windows which are so popular these days.

There was no pre-set for the MythFrontend application, so I ran the "Behavior Construction Kit" to create an identity for MythFrontend, and then went to the Behaviors preference to configure the buttons. Note that some buttons can have separate actions assigned to when you briefly press the button, and also when you press and hold the button. For other buttons, Remote Buddy requires you to press and hold the button, presumably to avoid inadvertent presses with one's palm.

Here are my button assignments:
  • Up: Cursor key up
  • Down: Cursor key down
  • Left: Cursor key left
  • Right: Cursor key right
  • A / Play/Pause: "P"
  • A / Play/Pause (held): Spacebar
  • B / Menu: Escape
  • B / Menu (held): Open the Remote Buddy menu
  • Home: Escape

The trigger-shaped "B" key on the facing side of the remote was the most complex to figure out. In normal practice, it's used by Remote Buddy to bring up its HUD menu, but I liked its behavior in Front Row, where clicking the "B" key goes one menu layer up, while clicking the round "A" key digs one menu layer down. Also in Front Row, clicking and holding the "B" key brings up the Remote Buddy HUD, and this seems a good compromise.

For simple watching of TV and recorded programs, you can drive MythFrontend with 7 buttons, which is a little unfortunate because the Wii remote has 6 buttons within comfortable reach of the thumb and forefinger. The space key--used to push some onscreen buttons, such as the quit confirmation, and to bookmark show positions--is used infrequently enough that I assigned it to holding down the "+" button. I also assigned space to the hold state of the "A" button, and time will tell if I get into the habit of using it. [Update: Time has told, and I use the held-down A button.] I could assign the other buttons ("-", "home", "1", "2") to less frequent commands, such as zooming or bringing up the program guide, but I've never gotten into the habit of using those MythTV features. For me, 7 buttons are sufficient. I do wish MythTV allowed the right cursor button to push any selected onscreen button; the few that act differently stick out and annoy.

For pure driving of the interface, the Wii remote is excellent. The remote has a heft I miss with the Apple remote, and fits comfortably in either hand with the forefinger and thumb resting next to their buttons ready to spring into action, rescuing me from commercials. Being wireless, there is no need to point, and button pushes are reliable, with little need to look at the remote during use. I haven't felt the need to use the attached strap, my TV watching is not quite that kinetic.

Remote Buddy's menu system makes it easy to switch between media applications or control iTunes in the background, while having a more reliable Bluetooth remote makes driving Front Row more pleasurable.

The downside of the remote is that it is radio-frequency based, and thus can't be used to drive my TV or stereo receiver. I still need my Harmony 550 to control those, and since I can use the 550 to drive my MacBook anyway, by pretending to be an Apple remote, the question becomes whether it is worthwhile to have a separate Wii remote just to control the MacBook. That's a tough question. Certainly the integrated solution is simpler and cheaper; on the other hand, the Harmony is a complicated, general-purpose remote and is not optimal for driving a 12-foot interface. You spend a lot of time staring at the Harmony, finding the button you've mapped to mimic the Apple Remote's "Menu" button, and its cursor buttons have nowhere near the smooth action of the Wii remote's.

When the Wii remote loses touch with the Mac (sleep, leaving its range, the baby taking the batteries out), you will have to re-pair it, which can be done quickly by pressing the 1 and 2 buttons simultaneously. Remote Buddy will re-detect it within seconds.

In summary, using a Wii remote to drive the MythFrontend, and other multimedia applications is fun and convenient. I'm looking forward to the day when I can put a Mac Mini in my TV room and get down to doing some serious viewing.
[Update: I notice that Remote Buddy will go into a mode where it uses a lot of CPU cycles (10% of a core) waiting to pair with a missing Wii remote. I'm assuming this is something IOSpirit can fix.]
[Edited for grammar, and inserted photo since first published.]

[UPDATE: I spent a happy weekend with my MacBook hooked up to the HDTV in the TV room. I mounted an SMB share full of video content from the Linux box in the basement, selected titles from DVDPedia, drove the Apple DVD Player, watched TV via the Myth Frontend, listened to iTunes, and watched movie trailers in Front Row. I even lazily browsed the Internet using the remote as a mouse. It was fun--and yes, I should get out more. The only problem with the remote itself was that the Myth Frontend would not get the pause command the first time, so I would end up clicking it 3 times to pause.

The Remote Buddy DVDPedia setup did not include the command key for launching linked files (Command-L), so I mapped it to the home button on the Wiimote. Annoyingly, the DVD Player had to be quit before moving to a different movie, as the DVD Player / DVDPedia interaction would go into some sort of loop preventing playback.]

Wednesday, April 11, 2007

Streaming video to VLC from MythWeb

After building and installing the latest MythTV version, I browsed through the mini-website, MythWeb, which gets installed and served by Apache. There is a page for all my previously recorded programming which comes up as a table:

It's a nice refinement from previous versions, but what caught my eye was this newly added icon:

Mousing over it gives a tooltip, "ASX Stream", and clicking on it causes a file to be downloaded with the double extension ".asx.asf". What is that? By default on OS X, QuickTime was set to open .asf files, but QuickTime didn't know what to do with this one. So I reassigned the file type to the VLC player. Now opening the file (which you can do by right-clicking in Safari's download window) launches VLC and starts playing the recorded program. No more relying on the somewhat clunky MythTV frontend for displaying recorded content. Instead, you can use the much more refined, and Mac-like, VLC player in either windowed or full-screen mode.

This is not perfect: VLC can't fast forward, jump forward, or scrub through the stream, so the MythTV frontend is still more functional--unless you want to watch commercials. OS X lists 5 applications registered to handle the .asf file type: VLC, MPlayer OSX, MPEG StreamClip, Windows Media Player, and WMV Player. Only VLC can actually use the file as generated by MythWeb.

I don't know what is missing from the stream (or from VLC), but I wish somebody would add arbitrary scrubbing. It would be very cool to use VLC for my TV viewing. It'd also be nice if MythWeb added a TV tuning page and live TV streaming. And it would be even nicer if the tuning page handed off my HDHomeRun's network stream directly to VLC instead of acting as a proxy.

Still, a really cool feature.

[Update: I couldn't actually watch a whole program without losing the stream, but it'd be cool if it worked.]

Wednesday, March 21, 2007

DVD Main Title Size: Where the Gigabytes Go

For my personal edification, I've been surveying my DVD collection in terms of how much each movie uses on disk. This is the main feature only, none of the extra fluff, but it does include wasted space taken up by language tracks which I do not speak. This is movies only, not TV shows.

So far I've found 187 movie DVDs around the house. There are discs my son has reduced to coaster status, and I've no idea where two-thirds of the Indiana Jones trilogy ended up. Still, I feel 187 is enough to get some feel for what the average movie uses on disk and how it correlates, if at all, with quality.

So some stats:
  • Disks: 187 disks
  • Average size of Main Feature: 5.10GB
  • Largest Main Feature: 12.94GB over 2 Disks: The Return of the King
  • Largest Main Feature on a Single Disk: 7.60GB King Kong
  • Smallest Main Feature: 3.08GB Sleepless in Seattle (which had plenty of room for both Thai and Korean subtitles)
  • Total GB: 948

Some surprises:

I have the Superbit version of Panic Room, the cheapest Superbit title I could find. Sony claims that Superbit is great because they fill up the whole disk with movie, making it as uncompressed as possible while allowing for both a Dolby Digital and a DTS soundtrack. Well, Panic Room clocks in at a measly 5.66 GB, leaving nearly 2 GB of room which could have been used for less compression. Having said that, the picture quality is quite good as it is. My other Superbit title, Adaptation, leaves around half a gig of empty space on the disk. Is it false advertising if you claim the highest quality DVD, and it turns out you could have made it even better by using less compression?

And how did Pirates of the Caribbean end up with only 3.61 GB? It was a special effects blockbuster with a huge marketing budget. They couldn't spend some tweak time getting a better picture?

Breakfast at Tiffany's is just the opposite: despite being quite pedestrian in its appearance, having only mono sound, and being none too long at 115 minutes, it manages to use up 7.18 GB. Where does it all go?

The disks that have the most movie on them tend to be major blockbusters: The Lord of the Rings, Minority Report, etc. where a lot of tweaking time can be spent getting everything to fit, or are just plain long (Patton). Disks with the lowest space usage tend to be catalog romantic comedies originally released pre-DVD. The studios didn't spend much time getting the highest quality disk, perhaps in the hope of coming back in a few years with a "Widescreen Collector's Edition", such as was the case with The Big Lebowski (3.54 GB), L.A. Story (3.57 GB), and Sleepless in Seattle (3.08 GB). By way of comparison, a short exercise DVD "Post Natal Yoga" clocks in at 3.94 GB.

[UPDATE: Here's a histogram of the number of disks per size. The cutoff for a single-sided, single-layer disk--4.7 GB minus a few trailers--is pretty obvious. I'm a bit surprised by the shape of the curve; if I had had to guess beforehand, I would have thought there would be two peaks, one just below the maximum capacity of a single layer and another just below the maximum for a dual layer. But once you get past the minimum for a single layer, it's almost as if it follows a uniform random distribution.]
[UPDATE: Updated for more disks surveyed.]