Wednesday, December 26, 2007

Cute Safari Tip: “Apple” to Apple Symbol 

The default Safari bookmark bar has a link to “Apple”, and another link, also labeled “Apple”, to an RSS feed of Mac news. You can save space by replacing the word “Apple” with the apple symbol: . (Anybody reading this on a non-Mac will likely see a question mark or an empty square instead of the familiar apple-with-a-bite-out-of-it logo.) This symbol can be found in the Special Characters window as Unicode F8FF (decimal 63743), or generated by typing Shift-Option-K.
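For the curious, the logo lives in Unicode’s Private Use Area, which is why non-Apple platforms show a placeholder glyph. A quick check in Python (just an illustration of the code point, nothing Safari-specific) confirms the hex and decimal values:

```python
# U+F8FF sits in Unicode's Private Use Area; Apple maps it to its logo,
# so only machines with Apple fonts render it as the apple glyph.
APPLE_LOGO = "\uf8ff"

print(APPLE_LOGO)            # the apple glyph on a Mac; a box or "?" elsewhere
print(hex(ord(APPLE_LOGO)))  # 0xf8ff
print(ord(APPLE_LOGO))       # 63743
```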




Other Mac-specific symbols can be found at MacBiblioBlog.

[Updated: Had given the wrong key equivalent]

Wednesday, December 19, 2007

The Chaotic Nature Of Windows' Menus

As part of my day job, I've recently become familiar with the creation, display, and destruction of menus on the Windows platform. In writing this entry, I hope not only to point out the aesthetic and technical disaster that is the Win32 menu framework, but also to offer a caution against falling into the same sort of problems on the Mac.


From the user's point of view, menus on Windows are a chaotic mix of styles. While we Mac users are used to the menus of every application looking the same on any given OS version, Windows users are treated to menus of wildly varying style, depending on the artistic sensibility of the individual programmer. Here are just a few of the menus found on my wife's fairly vanilla Windows Vista install:










Note the random selection of:
  • Highlight colors
  • Separator graphics
  • Highlight region shapes
  • Margins
  • Margin separators
  • Root menu item shapes
  • A ball icon where another application would use a check mark
  • A "Ctrl-P" instead of a "Ctrl+P"
  • Minimum spacings between the title and the accelerator
  • Sizing of the submenu triangle
  • Drawing modes for greying out icons
  • Degrees of fade for greyed-out text
  • Alignments of the accelerator text


Contrast this with a selection of applications on Mac OS X Leopard:






Although even Apple can't protect us from the attack of the ugly cross-platform icons:



Why the disparity? Because when programming to the Win32 API, Microsoft requires all but the simplest menus to be drawn by application code responding to two window messages: WM_MEASUREITEM and WM_DRAWITEM. The system doesn't make it obvious where to draw your text, how to highlight selections, what the margins are, etc. The only thing the system will do for you is erase the background. If all you want to do is draw an icon (perhaps the favicon for a website), you have to manually draw the whole menu item; and if you want any hope of having that menu item look like all the other menu items in that menu, you have to draw all the items in that menu; and if you do that, you will want to draw all the menu items in all the menus. If you code against the MFC framework, you will end up using some 3rd party menu class written by somebody in the same boat as you, forced to emulate and approximate whatever style their favorite Microsoft app uses on any particular OS. Future changes in appearance will not magically arrive with a new OS, but will require each application to be reprogrammed, or at least to have a dynamic library updated.
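To make the division of labor concrete, here is a toy model in Python (not real Win32 code; the function names and metrics are invented and merely echo WM_MEASUREITEM and WM_DRAWITEM) of who does what in an owner-drawn menu:

```python
# Toy model of owner-drawn menus: the "system" side only walks the items
# and erases the background; measuring and painting are the app's problem.

def show_menu(items, measure_item, draw_item):
    """Simulate the system's half of the owner-draw protocol."""
    y = 0
    for item in items:
        width, height = measure_item(item)   # akin to WM_MEASUREITEM
        rect = (0, y, width, height)
        # ...the system erases the background here, and that's all...
        draw_item(item, rect)                # akin to WM_DRAWITEM
        y += height

# The application's half: it must invent its own metrics and drawing.
def my_measure(item):
    return (8 * len(item) + 24, 20)          # ad hoc width, fixed height

drawn = []
def my_draw(item, rect):
    drawn.append((item, rect))               # a real app would paint here

show_menu(["Cut", "Copy", "Paste"], my_measure, my_draw)
print(drawn)
```

The point of the sketch is that nothing forces my_measure and my_draw to agree with any other application's choices, which is exactly how the chaos above arises.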


Again, the major reason Windows programmers draw their own menus is to add an icon to a handful of menu items (which is usually the idea of some marketing guy anyway). If Microsoft had only added the ability to attach a graphic to a menu item, few people would have bothered to draw their own. Life is short. Had Microsoft done so, it could have added all sorts of cool wizbangery to Vista, and we Mac users would be reduced to complaining about how they stole translucent menus from Apple.

And while it may seem like a minor thing, Microsoft does not provide separate routines to specify the text of a menu item and its accelerator (key equivalent). Even if you aren't drawing a menu yourself, you have to provide a combined string like "&Redo\tShift+Ctrl+Z".
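To see how much is packed into that one string, here is a small Python sketch (a hypothetical helper, not any Win32 API) that pulls it apart the way menu code must: the tab separates the caption from the accelerator text, and the ampersand marks the mnemonic letter.

```python
def split_menu_item(raw):
    """Split a Win32-style menu string into (label, accelerator).

    The tab separates the caption from the accelerator text; '&' marks
    the mnemonic letter. (This sketch ignores the '&&' escape that
    real Win32 uses for a literal ampersand.)
    """
    caption, _, accelerator = raw.partition("\t")
    return caption.replace("&", ""), accelerator

print(split_menu_item("&Redo\tShift+Ctrl+Z"))   # ('Redo', 'Shift+Ctrl+Z')
```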



Hopefully, .NET programmers have an easier time.


The net effect of all this is that Microsoft cannot:
  • Upgrade the operating system with a cool new appearance in which all the 3rd party applications (or even old Microsoft applications) magically share the new menu look. Click on a menu and it's 1986 all over again.
  • Have any kind of unified appearance.
  • Replace those awful-looking, English-centric, space-hogging accelerators with elegant Apple-style meta-key glyphs.
  • Seamlessly move to *resolution-independent GUIs. Apple is having trouble getting 3rd party applications to be resolution independent, but the fact that the OS draws the menus at least eliminates menus as a problem.



* Resolution independence is the idea that in our future of ultra-high resolution monitors, a menu bar, for instance, won't be 25 pixels tall, but will be 25 points (25/72nds of an inch) tall, with very high quality text.

But.

While all this Microsoft bashing is fun, this is a morality tale of sorts. Apple, in its wisdom, has given Cocoa developers much more control over how menus are drawn in Leopard. You can, in fact, draw anything you please in a menu now. Don't do it. It was bad enough when you could clutter up your menus with blocky little useless icons; now the user can innocently click on a menu and be confronted with the Gates of Hell. Don't do it. Resist the temptation. Put the NSView down. If you are porting a Windows app, do not let the pointy-haired boss force you into porting the ugly menus. Tell him it can't be done. Tell him you would lose Tiger compatibility. Tell him anything, but don't mess with Mac menus.

"Just because we can do a thing does not mean we must do that thing." (The Federation President in Star Trek VI: The Undiscovered Country)


[Entry updated to point out more things wrong with the Windows menus and to add detail.]

Monday, December 03, 2007

Off Topic: The Perception of Time as I Age

Two observations:
1) The months pass faster than when I was younger.
2) I feel slower mentally. The ideas come at a markedly lower rate.

Is it possible that these two observations are two sides of the same phenomenon? Perhaps my perception of time is based on how fast I think. If my younger self had 1,000 ideas a month, and my present self generates 500 ideas a month, do I then perceive two months as taking as long as one month used to take? It seems likely.