Sunday, December 14, 2008

How I Drive Around With My iPhone

If you plug your iPhone's headset into the headphone jack, and then plug a dock-connector-to-USB cable into the bottom connector, the phone has a very useful set of behaviors. Regular audio such as music or podcasts goes out the bottom connector—which in my case ends up coming out my car's speakers—while phone conversations go through the headset. Also, the click button on the microphone works to pause (single click), answer a call (single click), hang up (single click) and skip to the next track (double click).


Therefore, when I get into the car for my commute home, I plug my headset into the phone, put the right—microphone—ear bud into my ear, and then plug the phone into my car's stereo system. The order in which this is done is important. Then, as I'm driving around, if someone calls me, I can answer the phone, hang up, and skip tracks all without looking. Be warned: this might be illegal in your locality.

Thursday, November 27, 2008

A Tech Support Mystery

Last week, my co-worker Gerald came to my cube saying his PowerMac G5 was consistently locking up after starting up and bringing up Entourage. He had come to me as I am the Mac programmer on staff, and therefore considered to know all things Macintosh.

I went down to his cube to find the G5 unresponsive to mouse clicks; locked up just as he had said. Very odd. After a few seconds of futzing around I realized that the machine did respond to right clicks. And a few seconds later, Gerald mentioned in passing that every time he restarted, the optical drive ejected. I immediately knew what the problem was. Can you solve this mystery? Answer below.
















The left mouse button of his Logitech mouse was stuck in the clicked state. Thus the machine was non-responsive to left clicks, and as everyone should know, holding down the mouse button while restarting will eject the optical drive of a Mac. I went and got a Mighty Mouse, plugged it in, and the machine worked perfectly.

Friday, November 14, 2008

Patience Needed for Switching .mpg Application

My MythTV records over-the-air broadcasts as .mpg (MPEG) files of several gigabytes each. My preferred mode of viewing them these days is to open up an SMB share on the MythTV box, copy the file over to my MacBook while I'm working, and then view it later. Amongst the many annoying things about this process—inefficient SMB transfers, avoiding having Time Machine back up the show, OS X not letting me watch a partially transferred file—a small one is that QuickTime owns the .mpg extension, when it'd be nice if VLC did. But every time I would open a Get Info window on a .mpg file, the whole Finder would lock up, apparently generating the preview. And then I'd relaunch the Finder after a minute or so. So I was never able to click the Change All... button.

Turns out that the Finder will eventually, after several minutes, give up on creating a preview, and you can change the Open With application to VLC and then click the Change All... button. So be patient.

Friday, November 07, 2008

Sleep Advice for Geeky Fathers

As a father of young children, I have some advice for my fellow software engineers in similar boats who want time for their side projects. At some point, if you are lucky, your children will be sleeping from their bedtime of say 9:00 PM to maybe 7:00 AM depending on their natural needs. My advice is to work within this framework when it comes to your own sleep schedule.


Every day, come home from work, play with your children till their bedtime, fall asleep as they do, and wake up naturally. For me, falling asleep at 9:00 PM will mean waking up in a quiet, empty house at 4:30 AM with 2½ hours of time to work on my software projects and catch up on my TV watching before the children awake. I will spend the rest of the day fully rested and at my peak performance. Compare that with the alternative strategy of staying up after the babies' bedtime, which, if I "get on a roll," might keep me up till 12:30 AM, with no margin for error. What if the babies wake up at 6:00 AM, as they sometimes do? I'll spend the rest of the day under a sleep debt: low performing and grumpy.


Also, my recommended strategy allows you to pay off your sleep debt. If you do this and wake up later than expected the first few days, you had an accumulated sleep debt you needed to pay off before reaching peak efficiency.


I'm basing this advice on this Google Tech Talk which I highly recommend watching:

Thursday, November 06, 2008

When Is A Leak Not A Leak

I was sent a bug report on some software I had written a while back wherein it would become less and less responsive over time till it just sort of died. Sounds like a leak.


But when I hooked it up to the Leaks Instrument, I saw no leaks. No red peaks at all. Must not be a leak then.


But one more look. The net objects allocated were going up and up and up. This should not happen; the application should have reached an equilibrium point of net objects allocated and oscillated around it.


So, I let it run for a while in the Objects Allocated instrument, and then examined the blocks which were being allocated and not deallocated. Turns out I had an object, an audio player, which had a scheduled NSTimer associated with it. When I released the player, it was still being referred to by the NSTimer, so it did not get deallocated. And in this case, I was creating one of these every second or so (probably not a good idea), which meant that not only was the memory leaking, but all those timers were adding overhead to the event loop. No wonder it crawled to a halt. So remember to invalidate unwanted timers.
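
Here is a minimal sketch of the pattern, with a hypothetical AudioPlayer class standing in for my player object; the point is that somebody has to tell the object to invalidate its timer, because simply releasing it is not enough:

#import <Foundation/Foundation.h>

// Hypothetical class illustrating the retain cycle: a scheduled NSTimer
// retains its target, so the run loop keeps the player alive until the
// timer is invalidated.
@interface AudioPlayer : NSObject
{
    NSTimer *progressTimer;
}
- (void)stop;
@end

@implementation AudioPlayer

- (id)init
{
    if (self = [super init])
    {
        // The run loop retains both the timer and its target (self).
        progressTimer = [[NSTimer scheduledTimerWithTimeInterval:0.5
                                                          target:self
                                                        selector:@selector(tick:)
                                                        userInfo:nil
                                                         repeats:YES] retain];
    }
    return self;
}

- (void)tick:(NSTimer *)timer
{
    // update playback progress here
}

- (void)stop
{
    // Without this, -dealloc never runs: the timer keeps the player alive
    // and keeps firing, adding overhead to the run loop on every pass.
    [progressTimer invalidate];
    [progressTimer release];
    progressTimer = nil;
}

- (void)dealloc
{
    [self stop];
    [super dealloc];
}

@end

The owner must send -stop before its final -release; only then does the player's retain count ever reach zero.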

Wednesday, October 29, 2008

Debugging a Shower

The big extravagance at the house this year is replacing the cheapest-looking vinyl shower you ever saw with a very nice tile and glass block version. A friend of my wife's built it over the course of a month; it's beautiful. The babies have been learning the joys of showers over baths, and I've been spending much of my time pondering development problems from the built-in granite seat.


Unfortunately, it's started leaking, and in tracking down the leak, I've realized how broadly the skills we pick up debugging software can be applied to the physical world.


Initial Bug Report
After a lengthy bout of baby cleaning, my wife noticed a wet spot growing from the hallway wall shared by the new shower. She reported the bug to me as a high priority.
Observation of Bug's Behavior
The bug only manifested itself after a shower, and specifically after a shower for the babies.
Initial Conclusions
If the copper pipes inside the shower were leaking, then using the shower would be irrelevant; it would leak regardless of use. The only exception to this would be a leak in the pipe from the valve to the shower head. So no need to rip out the walls to get to the water pipes.
Experimentation
I worked on the assumption that if the shower was leaking, it was doing so near the wet spot, and most likely at the most complicated part of the shower: the granite seat. So I took my morning shower with the spray pointed away from the seat. Result: no wet spot. Conclusion: the seat was not properly sealed; there was a seam through which water could get out into the outside world. This meshed well with the initial bug report, as the babies tend to stand on the seat and have the spray directed at them there for long periods.
Rejected Alternative Hypothesis
Someone suggested that perhaps the flow was overwhelming the drain and a path to the outside was being reached from the bottom. This seemed unlikely, as the flow into the drain should be the same or less when the spray was pointed towards the seat. And 15 minutes of water aimed directly at the drain disproved it.

Final Conclusions
Have to reseal all the joints, and re-grout the tile beneath the seat till we find the bad seam.

Saturday, October 25, 2008

Visual Studio 2005 C++ and the ? : operator

One thing you want in C++ compilers is consistency with other C++ compilers. Especially if you are doing cross-platform development work. I want the VS C++ compiler to interpret what I tell it to do exactly the same way that gcc interprets it. That way, I don't have to special case my compilations, and I don't create a bunch of platform specific bugs in my platform neutral code. And usually, the two compilers do agree.


However, I discovered today (Friday) that if you compile something like:
double x = (false) ? 0 : 1.4;
that x will not equal 1.4, as most people (and the gcc compiler) would think, but rather it will equal 1.0. Why? Because the compiler sees the 0, interprets it as an integer, and decides that since both halves of the ?: have to have the same type, that type will be integer. The fact that this is in the middle of an assignment to a double means nothing.

In order for x to equal the expected 1.4, you have to write:
double x = (false) ? 0.0 : 1.4;


I'm not even saying that Visual Studio is wrong. It's different from gcc, which leads to platform-specific bugs, and it's unexpected, so the unaware coder will type the unwanted form and never suspect something could go wrong.

[Update: Russell Finn goes into more language-lawyer parsing of this and believes VS C++ is wrong. (I don't know Mr. Finn; I just like to read my Site Meter page to see how people get to my blog.)]


[Update 2: And don't get me started about how VS C++ allows you to put the closing angle brackets of a nested template declaration together without a space, as in:
vector<pair<string, int>> myVectorOfPairs;

]

Thursday, October 23, 2008

News of the Not News: People Like Free iPhone Apps

This is one of those blog posts where the author has two data points and has to write an entire entry about those points. Here is the data: My iPhone app to help HDHomerun users align their antennas, Signal GH, went 5 days without a sale, despite getting a favorable mention on a well-known podcast on Friday. So I temporarily started giving it away to boost exposure and reviews. A day later, I had 205 downloads from the U.S.

So at $2.99: 0 unit sales per day == $0.00 profit
And at $0.00: 205 unit sales per day == $0.00 profit

At least I'm breaking even.

I believe a large number of these downloads are pointless, as you have to have an HDHomerun to do anything at all with Signal GH. There are people who troll the back pages of the utilities section of the App Store, downloading everything free. I just hope these people don't write reviews.

Thursday, October 16, 2008

Thankful to be a Cocoa Programmer

The company I work for is interested in Microsoft's .NET framework; I am not. But, I was scrounging for free content on iTunes the other day and came across the .NET Rocks podcast, and out of idle curiosity downloaded a few episodes. They are well done, and put together by friendly people with an unflagging enthusiasm for Microsoft technologies. And they make me very happy to spend much—unfortunately not all—of my time working on Macs and iPhones.


Apparently, the target audience is comprised of developers putting together custom business applications; the kind of vertical apps corporate America (and apparently Dubai) consume by the megalines of code. Not the general purpose, high degree of finish, large user base applications I've spent much of my career writing, but database frontends and the like. It's the Bizarro world out there, where right justifying labels is considered a major advance in GUI development.


Any iPhone developer would be well served by listening to the recent episode on Windows Mobile. As someone who was scrounging for work when the Mac had 3% market share, I can sympathize with the pathos of a product manager for Windows Mobile trying to put a brave face on disaster, but come on. This exchange pretty much sums up the level of wishfulness and straw-grasping:
Host: So let me ask the question a different way: the next version, whatever it's called; should Apple be scared?
Rob Tiffany: Very scared.
Hosts: (laughs) YES!
Rob Tiffany: Very scared.
Hosts: I knew it.
Rob Tiffany: Yeah, yeah, we're working on some secret sauce out there.
Host: Not too secret anymore!


I will congratulate the hosts for not being blind to WinCE's current flaws: they gave Rob a hard time about how the phone app on their phones was glacially slow; I'm just amazed they think it will get better. That old saying about a second marriage being the triumph of hope over experience...


And the episode on complying with the corporate governance rules of the Sarbanes-Oxley law... If I had to do that sort of thing, I'd seriously consider going to work at Home Depot. How does one show up at work every day doing that sort of thing?


And that's the thing. I'm nearly always happy either going into work—assuming I won't be spending the whole day fixing OLE bugs on the PC—or pulling out my MacBook and adding a refinement to an iPhone app. Life is sweet. I get to work in an application framework which was designed right from the start: lightweight and powerfully elegant. I'm not one of a hundred cogs living in a condo in Dubai; I'm a sole practitioner, or an unblended part of a small team. I write software people don't have to be paid to use. I don't have to wait for the secret sauce which never comes. I am a Cocoa programmer, and for that I am thankful.

Saturday, October 11, 2008

The Loss of Homogeneity: iPhones on the Network

Developing for the iPhone and iPod Touch is nice: limited hardware differences lead to reliability. If it works on my iPhone running the 2.1 OS, it should run well on your iPhone. They are pretty much the same. Homogeneity is good when it comes to testing for program errors.


So your iPhone is the same as mine, but your network is not.

I submitted a new version of Remote Remote GH for OS X Touch to Apple a week ago. It had been gradually gestating and stabilizing as I had time to improve it. For the last several weeks, it had been quite stable, and it had never been much of a crasher to begin with. Thursday night, Apple notified me that the new version was available for distribution on the App Store; Friday morning RRgh crashed on me on launch. Wha?

Backing up. I had been playing with the XBox Media Center (XBMC) for the AppleTV, and last night I had tried turning on its UPnP (Universal Plug and Play) server and client. The new RRgh's big feature is support for auto-detecting MythTV frontends via UPnP, so having another UPnP device on my network meant my code had to check to see that it wasn't a MythTV. Unfortunately, this revealed a latent crashing bug where I was expecting a string, got a NULL pointer instead, and boom, boom, crash. (No actual audio percussion, just a silent quit.)
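
The fix itself was a one-line guard. Here is a hedged sketch of the idea; the helper name is mine, and the field could be any C string handed back by the UPnP library:

#import <Foundation/Foundation.h>

// Wrap a possibly-NULL C string from the UPnP library before handing it to
// Cocoa. +stringWithUTF8String: raises (or worse) when passed NULL, which in
// a tight discovery loop looks exactly like the silent quit described above.
static NSString *SafeStringFromUPnPField(const char *field)
{
    return (field != NULL) ? [NSString stringWithUTF8String:field] : @"";
}

// Usage, where deviceTypeField is whatever the library returned (the MythTV
// check is purely illustrative):
//   NSString *deviceType = SafeStringFromUPnPField(deviceTypeField);
//   BOOL isMythFrontend = ([deviceType rangeOfString:@"MythTV"].location != NSNotFound);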

An early updater experiencing (what I assume is) the same problem figured out my e-mail address and let me know that I wasn't the only one.

So, I had to quickly remove RRgh from distribution pending a 1.1.1 release, as I have no wish to get a deluge of negative reviews from whatever small fraction of the populace would be affected by this. Which means, there is no RRgh currently available for download, as I don't have the option to revert to the 1.0 binary.

If you are one of the people who downloaded 1.1, I hope it isn't crashing on you, but I've submitted an updated binary which should be out within the week.

Thursday, October 02, 2008

News of the Not News: QuickDraw Printing is Dead

As I occasionally let on, my day job is the care and feeding of an ancient, semi-well-known Carbon application (and worse, its Win32 doppelganger, and worse still, its Win32 doppelganger's OLE interface). One "perk" is the occasional confrontation with the living dead. The dead in question is how we print. We print using a very old and elaborate system of QuickDraw calls spiked with embedded Postscript. This creates very nice output. On Postscript printers. From PowerPC Macs.
For Intel Macs, not so good. Why? Because there is no embedded Postscript in QuickDraw printing support for Intel Macs. It didn't make the cut of creaky technologies Apple bothered porting over during the Intel transition. So, yes, text and polygons look OK, if imprecisely positioned, and that's most of our output; but anything spline-based is bad, as there are no splines in QuickDraw, only very big polygons.
So, users of ancient Carbon applications: check out the printing on your shiny new Intel box.

Tuesday, September 23, 2008

How I Didn't Get Rich Via my iPhone App

I think we've all seen the stories about some guy in New Zealand making $6K a day for a Choplifter clone, or another fellow making $250K in two months reimagining Tetris. What you don't hear about are the flops. And I have a flop on my hands. My application to monitor over-the-air digital television signal quality, Signal GH, has sold 10 copies. I had no illusions that it would be a big seller; the target market, the intersection of iPhone owners and HDHomerun users, is small; I just assumed there were a few thousand such people and they would all want my app.


Apparently not.


Unlike a lot of apps getting shoveled onto the App Store, Signal is competently coded: it performs a useful service, launches quickly, doesn't leak, doesn't crash, and has only a smattering of bugs. Maybe prospective buyers would like to know this, but I have had no reviews so far.


Part of the problem was that Apple mistakenly lists its release date as the day I submitted it to them for testing, instead of the day they actually released it to the public, so I got no time on the front page of the Utilities page. And part is probably too high a price point; maybe $5 is just too much even for niche-market apps. Again, I'm flying in the dark here because I've gotten zero feedback.


I also don't know where HDHomerun users hang out. The forums on Silicon Dust seem fairly low volume. Maybe if the HDHomerun were buggier, people would be filling their forums but dang it if the HDHomerun isn't the most reliable gadget on my network. And, I don't want to make commercial announcements on public forums anyway; nobody likes that sort of thing. I'm not a marketing wiz, I am a fairly good Mac coder.


Oh well, I only wasted a couple weeks of spare coding time. I've learned my lessons about targeting larger markets. I learned a thing or two about iPhone development, and hopefully my next post on the subject will be "How I Did Get Rich Via My iPhone App."

UPnP with Cocoa

Needing a license-compatible implementation of UPnP I could use to locate MythTV frontends, I made the mistake of using the C++ version of CyberGarage's library; the C++ version (1.7) is old and filled with bugs and leaks. The proper version to use is the C version (2.2), which is much less buggy and even comes with an Objective-C wrapper class.

If you are writing an iPhone app and need to locate a UPnP device, this is probably the way to go.


Oh, and one thing to consider is that if you look through the code, you will see that the library avoids opening up connections with network interfaces that have the IFF_LOOPBACK flag set—for obvious reasons. You might want to also avoid network interfaces with the IFF_POINTOPOINT flag set, as that is as like as not the cell network radio, and you probably don't want to make UPnP inquiries over the cell network.
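
Here is a minimal sketch, in plain C rather than the CyberGarage sources themselves, of skipping those interfaces with getifaddrs:

#include <ifaddrs.h>
#include <net/if.h>
#include <stdio.h>

// Enumerate interfaces and skip the ones UPnP discovery should leave alone:
// loopback, point-to-point (likely the cell radio), and anything that is down.
static void ListUsableInterfaces(void)
{
    struct ifaddrs *interfaces = NULL;
    struct ifaddrs *cursor = NULL;

    if (getifaddrs(&interfaces) != 0)
        return;

    for (cursor = interfaces; cursor != NULL; cursor = cursor->ifa_next)
    {
        if (cursor->ifa_flags & IFF_LOOPBACK)
            continue;   // talking to ourselves
        if (cursor->ifa_flags & IFF_POINTOPOINT)
            continue;   // as like as not the cellular radio
        if (!(cursor->ifa_flags & IFF_UP))
            continue;   // interface is down

        printf("usable interface: %s\n", cursor->ifa_name);
    }
    freeifaddrs(interfaces);
}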

[Update: Looks like TCM Port Mapper might be the preferred library, but I haven't tried it.]

Wednesday, September 10, 2008

iTunes 8 is still not Cocoa

Earlier in the year, I posted a blog entry about how Apple, obviously, must be getting ready to release a new version of iTunes based upon the Cocoa framework, and not the Carbon framework. While I still believe this to be true, iTunes 8 is not that product.


A quick look through the application's bundle shows all the signs of a product still deeply welded to Carbon.
A quick glance through the Info.plist:
  • No Principal Class entry, so no NSApplication at launch
  • The HIWindowFlushAtFullRefreshRate flag is set, indicating at least a few Carbon windows
  • The "Application requires Carbon environment" flag is set


There are a few .nib files in the bundle, but all the ones I looked at were Carbon.

Localized strings are still kept in a resource (.rsrc) file! What in the world.


In fact, there are no obvious signs of any Cocoa use at all.



As someone whose day job involves maintaining a Carbon app that "should" be transitioned to Cocoa, I can sympathize. But it is concerning that Apple wastes development time adding features like Genius to a straight Carbon application. Anybody writing new features will know that their work will have a shelf life of about 9 months and will have to be reimplemented—at least the GUI parts—in Cocoa. And the sooner a Cocoa iTunes comes out, the sooner it will become a mature, bug-free product.

Thursday, September 04, 2008

Revamping a Dated Cocoa Application: InCDius 2.5

iPhone development has re-energized my love of programming. My day job requires me to do too much Win32 coding, and that is soul deadening for a Cocoa programmer. The combination of corporate priorities and the fact that I can add features much more quickly and with fewer bugs in Cocoa leads me to spend a huge fraction of my time in Visual Studio, when I'd rather be in Xcode. It's been draining.


I released InCDius 2 back in 2002, and it never got out of beta, although beta 16 was quite stable, and I apparently have a number of loyal (yet put upon) users. It is just an Audio CD database, not a personal media database like the well-regarded Delicious Library. It was fast, and pretty darn stable, although a few users have had database corruption issues—please back up to XML, people. I knew for years that I should release a new version, but could never quite muster the energy. It embarrassed me having this old, non-Intel application gradually getting less stable with each OS release, and I was considering my options for killing it entirely.


But my Remote Remote GH for iPhone application makes heavy use of the SQLite database, so I felt confident I could transition InCDius away from the Berkeley DB database. This transition had been the most daunting of the reasons keeping me from releasing an update.

My first impulse was to refactor the whole application using Core Data, but that would not have added anything but development time; the GUI was wired up quite well; the application was already scriptable. All I needed was a new database backend, and Apple ships OS X 10.5 with SQLite 3 right in /usr/lib. By choosing SQLite, I was no longer responsible for compiling and upgrading my database library; I can rely on it being there and just working. I tried using the QuickLite SQLite wrapper, but it was not up to the task, as it creates a large number of objects maintaining its internal cache, to the point of locking up my Mac. No, I had to call the SQLite C API directly. As long as I minimize random access into the database, I get good results.
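
For the curious, here is a minimal sketch of what calling the C API directly looks like; the table and column names are made up for illustration, not InCDius's actual schema:

#import <Foundation/Foundation.h>
#include <sqlite3.h>

// Prepare once, bind, step through the rows, finalize. Keeping the work in
// one statement avoids the per-row object churn a caching wrapper creates.
static NSMutableArray *DiscTitlesMatching(sqlite3 *db, NSString *searchTerm)
{
    NSMutableArray *titles = [NSMutableArray array];
    sqlite3_stmt *statement = NULL;
    const char *sql = "SELECT title FROM discs WHERE title LIKE ?";

    if (sqlite3_prepare_v2(db, sql, -1, &statement, NULL) != SQLITE_OK)
        return titles;

    NSString *pattern = [NSString stringWithFormat:@"%%%@%%", searchTerm];
    sqlite3_bind_text(statement, 1, [pattern UTF8String], -1, SQLITE_TRANSIENT);

    while (sqlite3_step(statement) == SQLITE_ROW)
    {
        const unsigned char *text = sqlite3_column_text(statement, 0);
        if (text != NULL)
            [titles addObject:[NSString stringWithUTF8String:(const char *)text]];
    }
    sqlite3_finalize(statement);
    return titles;
}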

I also wanted this to be a release that could live on its own for a very long time; who knows how long I'll go without finding the time to update it. I wanted code which would last a very long time. This meant not only transitioning to Intel, but compiling for Intel 64-bit, and removing every deprecated API or framework message I could root out. I decided to make the minimum OS X version 10.5 (Leopard), and to transition the codebase to Objective C 2.0, although it isn't quite all there yet. I'll have to work on the garbage collection, but I am using both Objective C 2.0 fast enumeration loops and properties where appropriate.
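
A small sketch of those two Objective-C 2.0 idioms; the class and property names are illustrative, not InCDius's real ones:

#import <Foundation/Foundation.h>

@interface CDDisc : NSObject
{
    NSString *title;
    NSArray *trackNames;
}
@property (copy) NSString *title;
@property (retain) NSArray *trackNames;
@end

@implementation CDDisc
@synthesize title;
@synthesize trackNames;

- (void)dealloc
{
    [title release];
    [trackNames release];
    [super dealloc];
}
@end

// Fast enumeration replaces the old NSEnumerator loop:
//   for (NSString *trackName in aDisc.trackNames)
//       NSLog(@"%@", trackName);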

Forward thinking means removing the past, so out went a few features tied to old APIs. I had been allowing users to import information from Classic Mac OS's CD Player preference file, but that involved using old school Resource Manager calls. Anybody ever going to use this feature again? No. Then out it goes, along with the Resource Manager.

I had 2 separate ways to play Audio CD tracks. One used QuickTime; one used Core Audio. Kept the Core Audio.

NSTableView has a new (to me) way to deal with selections via NSIndexSets, and deprecated the messages I had been using. Goodbye old messages.
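
A small sketch of the new style; the table view here is just a hypothetical outlet:

#import <Cocoa/Cocoa.h>

// selectRowIndexes:byExtendingSelection: and selectedRowIndexes replace the
// deprecated -selectRow:byExtendingSelection: style of message.
static void SelectSingleRow(NSTableView *tableView, NSInteger row)
{
    [tableView selectRowIndexes:[NSIndexSet indexSetWithIndex:(NSUInteger)row]
           byExtendingSelection:NO];
}

static NSInteger FirstSelectedRow(NSTableView *tableView)
{
    NSUInteger first = [[tableView selectedRowIndexes] firstIndex];
    return (first == NSNotFound) ? -1 : (NSInteger)first;
}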

My own Objective C objects were filled with ints, unsigned ints, longs, and even the occasional unsigned. Hello NSInteger, NSUInteger and 64 bit computing.

I had been using a custom version of NSTableView to draw blue stripes. Goodbye custom class, hello check box in Interface Builder and grey stripes.

Obviously, I had been using .nib files for building my GUI. Time to upgrade everything to .xib files.

A user sent me his database (as a .zip archive of XML files) of 15,000 individual disks. Wow, and I had been thinking InCDius was zippy fast and ready for release. Massive performance tuning and the decision to load the database at launch instead of upon the first search. Much simpler and more reliable, and it lets the user do something else while everything gets ready.

Take time to remove all the warning messages. You will save yourself a huge amount of hassle in programming, and especially in Cocoa programming, if you eliminate all the compile warnings. You want to know right away if the compiler doesn't think NSView responds to "releese". Do not let warnings build up. Fix them. Fix them all.

A Google Tech Talk by Linus Torvalds inspired me to change the unique identifier key I had been using to lookup disks in the database; he uses SHA1 hashes to avoid corruption in Git, and it seemed to me that a SHA1 hash of the Audio CD's Table of Contents would be as close as I could get to being unique; although there will be rare instances when 2 CDs cannot both be in the database.
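
A minimal sketch of that key derivation, assuming the Table of Contents has already been flattened into an NSData; CommonCrypto ships with Leopard:

#import <Foundation/Foundation.h>
#include <CommonCrypto/CommonDigest.h>

// Hash the TOC bytes with SHA-1 and use the hex digest as the lookup key.
static NSString *DiscKeyFromTOCData(NSData *tocData)
{
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1([tocData bytes], (CC_LONG)[tocData length], digest);

    NSMutableString *key = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    size_t i;
    for (i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        [key appendFormat:@"%02x", digest[i]];
    return key;
}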

So after all this, what do I have? A fast, reliable(?), clean, focused little application, which I can send out into the world; spend the next few months fixing the occasional bug and adding the occasional feature, and which people can use for a good long while.

Thursday, August 28, 2008

Signal GH - iPhone App for the HDHomerun

[Update: It's finally in the iTunes App Store.]

I submitted my second iPhone application to the App Store this morning. Signal GH is a utility to monitor the signal quality of over-the-air television for American and Canadian users of the HDHomerun. I'm charging $4.99 for it, which seems a fair price given what a small market I'm targeting: the intersection of iPhone and OTA HDHomerun users. Anybody who's set a retail price knows how hard it is. And I'm sure there will be a lot of people giving me one star and claiming it should be free, or one star and claiming they didn't understand you needed an HDHomerun. Sorry guys, gotta send my son to day care; the requirements are right in the first sentence of the blurb.


The Silicon Dust engineers did a great job with their libHDHomerun API; anybody wanting to put out a C API should check it out for its use of the language, platform ambivalence, and object-like characteristics. Of course, they then released it under LGPL, making me wait the last couple weeks for them to modify the license. But they did, and I promptly submitted.


Now just step back and let the pennies roll in.

Saturday, August 23, 2008

Few USB Car Chargers work with iPhone 3G

[Update: an earlier version of this blog entry said that the 2A Griffin PowerJolt was able to charge my iPhone while playing audio and showing the GPS map. Additional testing showed this not to be the case.]

As I spend 4 hours a week driving my car to and from work, I need my podcasts to keep me sane. Before I bought an iPhone 3G last month, I had a very reliable cigarette-lighter-to-FireWire cable hooked up to a tiny dock which I would plug into my reliable iPod Mini. Apple, to save space, removed FireWire charging on the iPhone 3G, thus forcing users to go with inferior USB-style charging. In a quest to replace my previous setup, I found out the hard way just how inferior USB charging is.


To replace my original micro dock, I purchased an Apple Component AV Cable. This gave me a USB connector for charging and 2 RCA cables to plug into the convenient RCA ports I had previously installed in my Civic. And to think I was mocked for using RCA instead of a mini jack. Of course, this left me with 3 useless component video connectors, but they were easily bundled up.



Now I needed a cigarette to USB adapter, so off to Amazon. I thought the Mace Group USB Charger looked compact.


I was soon happily zipping back and forth between work and home before discovering something: it wasn't actually charging my battery. In the meantime, I had purchased another copy for my wife's car, and the iPhone did not even recognize that one as being connected. So back they went to Amazon.


Needing a charger for a long day of taking a visitor around town, I picked up a PowerDuo/PowerJolt by Griffin at Circuit City:


That seemed to go alright for a couple days until I left the iPhone displaying a GPS map for half an hour. What's that smell? The smell of the PowerJolt overheating.


OK, might need something more heavy duty. My, this Black & Decker Power to Go cup holder charger looks the part.

Turns out it only delivers 350mA to the USB port. The iPhone won't even notice you are trying to charge it.


How about this Kensington Car Charger for iPhone and iPod, rated at 1 Amp? It says right on the box: for iPhone (no mention of 3G).


Well, it didn't overheat, but it failed my listening-to-music-with-the-GPS-running test; it couldn't quite keep up.


Finally, going back to another Griffin PowerJolt from Best Buy, this one larger and with a 2 Amp fuse.


Here we have a winner. [Correction: During a longer test, the iPhone's battery did not build charge while playing audio and displaying the GPS map, but the PowerJolt kept cool and was nearly able to keep up.] Driving around while listening to music and with the GPS on, it was actually able to top off my battery. And this model has the new don't-start-smoldering feature.


It turns out that what any given USB port will deliver in terms of power varies widely: one charger might deliver 5 times as much current as another. I would recommend finding a charger with the highest available maximum amperage, something around 2A like the more recent PowerJolts, and avoiding dual port chargers unless they can guarantee a high current to both ports.

[Update: What is strange about this is that according to System Profiler, my iPhone lives on an allocation of 500mA while connected to my MacBook.]

Thursday, July 24, 2008

Remote Remote for MythTV for iPhone

I've written a "toy app" for OS X Touch, a version of my Remote Remote GH for MythTV application which acts as a remote control over the wireless network to control a MythTV. It's a toy in the sense I am mainly using it to learn the iPhone SDK, and not to earn money, or even a large user base. I don't expect it will find a large following, I was just learning the ways of the iPhone.


Coding for the iPhone is different. Not so much in the GUI classes or the tools, but in the outlook it requires of the engineer. I was working in a resource-starved environment: one in which I needed to take care not to run out of memory, not to use the network unless needed, and not to spend a lot of time launching or quitting. An environment where I had to justify every button on the remote control as space-worthy. I leaned heavily on Apple's performance tools, much more than I would on a Mac. And I refactored constantly. Every time I learned something new, I would change my code to work better and better fit the SDK way.


I also needed to use SQLite extensively, and felt compelled to adapt the QuickLite framework to both Objective-C 2.0 and the iPhone. If Tito Ciuro reads this, please send me an e-mail.


And finally, I'm quite proud of most of the artwork:

It's harder than it looks coming up with simple, thematically consistent, iconic graphics. Thanks to my wife for telling me to try again after my original app icon design mashed together a picture of a widescreen TV with a remote drawn in perspective. I went with a simple rendering of navigation buttons, which I liked so much I made the buttons part of the actual GUI.

Sunday, July 20, 2008

My Semi-Annual Rant about Compiling MythTV

So, I am testing a little iPhone app to control MythTV, and I decide I should upgrade to the current bleeding edge build. Which is bad enough, as I had Qt 3.3 installed on my Linux box and the bleeding edge is Qt 4 based, requiring me to figure out what my PATH, QTDIR, and QTMAKE variables should be, and then ending up with an upgraded database making it hard to go back. And then finding out that the remote control interface is pretty severely broken in a couple places, which meant I either had to fix it or cut back on the features of the app. And then finding out that mythweb is pretty broken, and that live TV is not improved. Etc.

Bad enough, and then I had to upgrade the Mac frontend to match the new version of the MythTV protocol. As if it never occurred to the guys at MythTV.org that standards should be flexible enough not to require constant changes. Ever notice that at the top of every well-formed XML file it says <?xml version="1.0" and that people have gotten along pretty well without changing the definition every 3.5 weeks? After a couple of hours of trying to get the frontend to compile on my MacBook, which has Xcode 3.1 and an older version of Qt futzing things up, I decided that the build scripts really didn't like llvm, and installed Xcode 3.0 on the Mac Mini in the basement. Then it was a couple hours of fixing a few errors in the Mac-specific code, and realizing the main configure file had to be made executable.

Then after it finally compiled, it was figuring out why I couldn't turn off pinging the database server (bad mysql.txt permissions), then why the frontend quit right away with the only clue being in the Console: "MythTV requires an updated schema". So the database schema had changed in the last 3 days, causing the frontend to quit without any warning. Classy.

And the playback is absolutely horrible on the Mini for some reason involving TOSLink passthrough or problems with linking in a library—I'm not sure. I really ought to buy another Mini, one which I can run as 100% OS X, and use EyeTV.

Saturday, June 14, 2008

Vectored PICT to PDF conversion in your code

You should not be creating any PICT files, but longtime Mac users might have a large number of .pict or .pct files lying around, and some (all?) of the tools Apple provides to open such files and convert them to PDF do a horrid job of it. This applies only to vectored PICTs, as bitmapped PICTs will look bad regardless.


First I refer you to this screen capture of a simple AppleWorks drawing:

You will notice that it looks a little archaic with its lack of anti-aliasing, but that's what people had back in the day.

I copied and pasted this image into TextEdit.app, saved the document as RTFD and then used the "View Contents" item in the Finder contextual menu to get to an actual old fashioned .pict file. This is how hard it is these days to generate a PICT file.


Just to see how not to do this, open the .pict file in OS X Preview.app, zoom in a bit, and you will see that however Preview is rendering PICTs it is doing a pretty poor job of it. I'm guessing it is using QuickTime import to create a bitmapped version.

Tell Preview to save as a PDF and you get this mess:


The basic problem here is that Apple provides an API for rendering a PICT into a Quartz context (and thus into PDF) which preserves the vectored nature of the original, and some applications do not use it. If you use this API, your onscreen representation of PICT files will be as good as they can be, and you will be able to export them to comparatively nice PDF files. This does not mean they will look as good as PDFs which had been created from the ground up as such; PICT's lack of support for transparency, Bezier curves, fractional coordinate systems and rotated text makes that impossible. But it will look a lot nicer.


QDPictRef LoadPICTFromURL(const CFURLRef pictFileLocation)
{   // warning, I did not actually compile or test this code
    QDPictRef result = NULL;
    CGDataProviderRef dataProvider = CGDataProviderCreateWithURL(pictFileLocation);
    if (dataProvider != NULL)
    {
        result = QDPictCreateWithProvider(dataProvider);
        CFRelease(dataProvider);
    }
    return result;
}

Look in the QDPictToCGContext.h header and you will find:

QDPictDrawToCGContext( CGContextRef ctx, CGRect rect, QDPictRef pictRef);


Then you can use the resulting QDPictRef to draw into a Quartz context, and if that CGContextRef was created via a call to CGPDFContextCreate, then you have created as nice a copy of the original PICT as is possible as seen by this screen shot at 200% zoom:

and this PDF result:


Note that rotated text still looks horrible, as the only way to make QuickDraw draw rotated text onscreen was to draw into an offscreen bitmap and then rotate the pixels in the bitmap. It was possible, however, to insert a series of PicComments which inserted rotated text into a PICT. I've checked this out with PICTs created by a separate application, and the Apple PICT to PDF converter honors these comments. I guess AppleWorks just didn't bother to put them in. [Update: Paragraph rewritten to add extra info, and to hide temporary idiocy on my part.]

Also check out the little bump in the arrow heads, probably a glitch that went unnoticed in the non-antialiased original caused by drawing the shaft too long and not quite on center. Otherwise, the new antialiased version looks much nicer. And you can even select and copy the (non-rotated) text right here in the browser.

It's not unusual, but unfortunate that Apple is inconsistent in using its own API. Preview.app obviously does not, nor does the PICT CoverFlow plugin, but TextEdit.app appears to, resulting in the oddity of a PICT in a CoverFlowed RTFd document looking much better than the CoverFlow of the PICT file itself.

Regardless, I recommend that anybody with a large collection of vectored PICTs make PDF copies of them as there may come a version of Mac OS X which will not have any support for PICTs at all. For instance, I doubt you can view PICTs on an iPhone. Warning: see my other posts about how PDFs do not contain the extra data which allows the originating application to recreate the original document. So keep the originals around too.

[Update: I've corresponded with Thorsten Lemke, proprietor of LemkeSoft and creator of the well known Graphics Converter application. He immediately saw the value in incorporating improved Quickdraw picture processing and conversion in his product and version 6.11 and higher (including this morning's beta) will feature it. I envy the nimbleness of independent software vendors. I can't tell you how long my day job company takes getting a minor release out to our customers.]

[Update: here is the source for a simple Automator plugin I threw together to make the conversion.

#import "QuickDrawPictureToPDFExporter.h"

QDPictRef LoadPICTFromURL(const CFURLRef pictFileLocation)
{
    // Create a QDPictRef from a .pict file on disk.
    QDPictRef result = NULL;
    CGDataProviderRef dataProvider = CGDataProviderCreateWithURL(pictFileLocation);
    if (dataProvider != NULL)
    {
        result = QDPictCreateWithProvider(dataProvider);
        CFRelease(dataProvider);
    }
    return result;
}

@implementation QuickDrawPictureToPDFExporter

CGContextRef CreatePDFContext(const CGRect mediaRect, CFMutableDataRef theData)
{
    // Create a PDF context that writes into the given CFMutableData.
    CGContextRef result = NULL;
    if (theData != NULL)
    {
        CGDataConsumerRef theConsumer = CGDataConsumerCreateWithCFData(theData);
        if (theConsumer != NULL)
        {
            result = CGPDFContextCreate(theConsumer, &mediaRect, NULL);
            CGDataConsumerRelease(theConsumer);
        }
    }
    return result;
}

- (id)runWithInput:(id)input fromAction:(AMAction *)anAction error:(NSDictionary **)errorInfo
{
    NSMutableArray *output = [NSMutableArray array];

    if (![input isKindOfClass:[NSArray class]])
    {
        input = [NSArray arrayWithObject:input];
    }

    NSEnumerator *enumerator = [input objectEnumerator];
    NSString *aPath = nil;

    while ((aPath = [enumerator nextObject]) != nil)
    {
        NSURL *inputURL = [NSURL fileURLWithPath:aPath];
        QDPictRef aPictRef = LoadPICTFromURL((CFURLRef)inputURL);
        BOOL loaded = (aPictRef != NULL);
        BOOL drawn = NO;
        CFMutableDataRef theData = NULL;
        if (aPictRef)
        {
            CGRect mediaRect = QDPictGetBounds(aPictRef);

            theData = CFDataCreateMutable(NULL, 0);
            CGContextRef cgContext = CreatePDFContext(mediaRect, theData);
            if (cgContext != NULL)
            {
                // Draw the picture into a single PDF page.
                CGContextBeginPage(cgContext, &mediaRect);
                CGContextSaveGState(cgContext);
                QDPictDrawToCGContext(cgContext, mediaRect, aPictRef);
                CGContextRestoreGState(cgContext);
                CGContextEndPage(cgContext);
                drawn = YES;
                // Releasing the context finalizes the PDF data.
                CGContextRelease(cgContext);
            }
            QDPictRelease(aPictRef);
        }
        else
        {
            NSString *errorString = NSLocalizedString(@"Picture to PDF could not read the input file as a PICT.",
                                                      @"Could not read input file");
            *errorInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                          errorString, NSAppleScriptErrorMessage, nil];
        }
        if (drawn && theData)
        {
            NSString *outputPath = [[aPath stringByDeletingPathExtension] stringByAppendingPathExtension:@"pdf"];

            if (![(NSData *)theData writeToFile:outputPath atomically:YES])
            {
                NSString *errorString = NSLocalizedString(@"Picture to PDF could not create the output file.",
                                                          @"Couldn't write data to file");
                *errorInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                              errorString, NSAppleScriptErrorMessage, nil];
            }
            else
            {
                [output addObject:outputPath];
            }
        }
        else if (loaded)
        {
            NSString *errorString = NSLocalizedString(@"Picture to PDF could not draw the File.",
                                                      @"Could not draw output file");
            *errorInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                          errorString, NSAppleScriptErrorMessage, nil];
        }
        if (theData)
        {
            CFRelease(theData);
            theData = NULL;
        }
    }
    return output;
}

@end

]

Thursday, May 15, 2008

Keep your Frameworks from Metastasising

Think Class Library, MacApp, PowerPlant. What do these frameworks have in common? They are all on my resume and they are all dead. Qt, Cocoa, AWT(?), Swing, MFC(?), .NET. What do these frameworks have in common? They are not dead, yet. The question mark indicates a framework in the zombie state of not being improved upon, but being used by too many people to be considered dead.
"It just so happens that your friend here is only MOSTLY dead. There's a big difference between mostly dead and all dead. Mostly dead is slightly alive." --Miracle Max in The Princess Bride



And I am glad that frameworks die. I would rather not live in a world where, 23 years later, MacApp was the best we could do. It was a pain to work with, and I'm much happier with Cocoa or even Qt. People learned what was wrong with the old frameworks and made better ones. (If only this were true of MFC.) And, barring asteroid collision, people will come up with new frameworks in the future.


And that is the rub. Frameworks get you two ways. 1) They usually lock you into writing all your code in a particular language. If the next great application framework uses Python, and all your work is in C++, you're going to be awfully busy. 2) You drink the Kool-Aid and use framework classes everywhere. If you are a Qt user, half your methods take QStrings, and when the time comes to switch from Qt to Cocoa, you will be refactoring for weeks.


The language lockin is tough. I've known for years that C++ is not really a great application language; it may be a decent OS level language, but it is too fragile, too demanding, too static, and too complicated for making reliable desktop applications. But what else are you going to use for cross-platform apps? Maybe .NET will kill C++ as a cross-platform language. I don't know; it'll be a while. Hopefully, an appropriate cross-platform language will rise up to take its place. Which is part of my point, 10 years from now, we will not be using the same language on the same framework to do development. Something better will arise.


Getting back to the point of this entry, you want your code to last longer than any one framework. It's a matter of amortization. The longer your code lasts doing useful work, the higher the payoff for writing it. Over the 16 years I've been programming Macs, I've used 5 frameworks, and 3 of them are dead. I am not predicting the imminent death of Cocoa or Qt, far from it, but die they will, and I should have a backup plan. And here is the other rub: my backup plan 'til now has been to write cross-platform, pure C++ code using a lot of STL and Boost, and to try to keep the platform-specific code from creeping into the general-purpose code. But I just said, perhaps prematurely and wistfully, that C++ itself might fall out of favor with framework developers, which both makes cross-platform development more difficult without a lingua franca and negates the hedge against your framework dying.


So I don't have a long-term solution, but in the meantime, follow what has long been good advice. Keep your use of frameworks limited to a thin GUI layer on top. Abstract the interfaces between your code and the framework. Abstract, abstract, abstract. To the extent practical, do not propagate framework classes into the meat of your codebase. If people had done this in the past, they could have skipped from MacApp to PowerPlant to Qt with a song in their heart instead of the crushing pain it was for most folks. Do not get locked in without a very good reason.


And know when it is time to scrap your life's work and move to a new language.
...watch the things you gave your life to, broken, And stoop and build 'em up with worn-out tools... --Kipling

Tuesday, May 13, 2008

Quartz to PDF, versus PS to PDF

I am ankle-deep in Postscript code at my day job, so as a refresher I took an afternoon and hand-coded a business card for my wife, who is starting a side business arranging academic tours of China (for the Peking University Department of Philosophy and Religion). Obviously, most people would be better served creating a card in InDesign or another vectored drawing editor, but this was a learning experience.


Here it is with the contact info scrubbed.



Postscript is not a friendly language, but it was simple enough creating a .ps file in my text editor—BBEdit—and just dragging and dropping the icon from the editor's title bar onto the OS X Preview application. Preview has the convenient feature of auto-magically converting Postscript files to PDF. Of course, compilation errors result in a cryptic failure dialog, but these were enough development tools for an afternoon's project. If I were to do this on a more regular basis, I'd compile a simple app which provided a message callback to the CGPSConverterCreate routine.
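
Such an app would be mostly boilerplate. Here is a minimal sketch, with error handling pared down, of converting a PostScript file to PDF with CGPSConverter and logging the converter's messages:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// The message callback is where those cryptic failure dialogs become
// readable PostScript error messages.
static void MyNoteMessage(void *info, CFStringRef message)
{
    NSLog(@"PS converter: %@", (NSString *)message);
}

static BOOL ConvertPSToPDF(NSString *psPath, NSString *pdfPath)
{
    CGPSConverterCallbacks callbacks = {0};
    callbacks.noteMessage = MyNoteMessage;

    CGPSConverterRef converter = CGPSConverterCreate(NULL, &callbacks, NULL);
    CGDataProviderRef provider =
        CGDataProviderCreateWithURL((CFURLRef)[NSURL fileURLWithPath:psPath]);
    CGDataConsumerRef consumer =
        CGDataConsumerCreateWithURL((CFURLRef)[NSURL fileURLWithPath:pdfPath]);

    bool converted = false;
    if (converter && provider && consumer)
        converted = CGPSConverterConvert(converter, provider, consumer, NULL);

    if (provider) CGDataProviderRelease(provider);
    if (consumer) CGDataConsumerRelease(consumer);
    if (converter) CFRelease(converter);
    return converted ? YES : NO;
}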


Consider how much easier it would be to create this file in Quartz, and how much better the output would look.

  • Font handling is hard in Postscript, even if you don't have to embed descriptions of your fonts. Thus my use of standard Postscript fonts Times and Helvetica.
  • Kerning is not automatic in Postscript. Yes, you can use kshow to manually set the spacing between pairs of characters. No, I'm not going to do that. Thus, the odd spacing between letters.
  • If this file had included characters from non-Roman languages, complications would arise as Postscript only allows 256 characters per font encoding, potentially requiring multiple font definitions per face. None of the free Unicode support in Core Text/Quartz.
  • I might have used a little transparency, but there's no such thing in Postscript.
  • Postscript is hard to read and maintain. The extra syntax needed to keep the parameter stack organized distracts the eye away from the actual algorithm being described. Postscript is certainly more compact in the editor, by an order of magnitude, than a series of Quartz API calls, but much of that compactness is wasted pushing, popping, dup'ing, and rolling the parameter stack. I mean, just look at it: [Update: changed ATSUI to Core Text]
    gsave
    basefont [9 0 0 9 0 0] makefont setfont
    (Nashua, NH 03061)
    dup
    stringwidth pop 2 div neg 0 rmoveto
    show
    grestore
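
For comparison, here is roughly the same centered line of text drawn with Quartz and Core Text; this is a minimal sketch rather than the actual card code, and the font, size, and coordinates are assumptions:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Draw one line of centered text into an existing Quartz context.
// Kerning, Unicode, and font embedding all come along for free.
static void DrawCenteredLine(CGContextRef context, NSString *text,
                             CGFloat centerX, CGFloat baselineY)
{
    CTFontRef font = CTFontCreateWithName(CFSTR("Times-Roman"), 9.0, NULL);
    NSDictionary *attributes =
        [NSDictionary dictionaryWithObject:(id)font forKey:(id)kCTFontAttributeName];
    NSAttributedString *styled =
        [[NSAttributedString alloc] initWithString:text attributes:attributes];

    CTLineRef line = CTLineCreateWithAttributedString((CFAttributedStringRef)styled);
    double width = CTLineGetTypographicBounds(line, NULL, NULL, NULL);

    CGContextSetTextPosition(context, centerX - (CGFloat)(width / 2.0), baselineY);
    CTLineDraw(line, context);

    CFRelease(line);
    [styled release];
    CFRelease(font);
}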



The fact of the matter is that Apple has done a lot of heavy lifting for us either via proprietary extensions to PDF, or taking the extra effort of providing optimized support for font embedding. An afternoon spent trying to keep track of an unruly parameter stack was enough for me to appreciate how much power Quartz gives us, and how easy it is to call upon.

Thursday, April 03, 2008

What does it mean when your app launches faster in Debug?

Ummm, maybe you forgot to uncheck "Open using Rosetta" in the release build?


I was enthusiastically waiting to see how fast my day job application would launch now that I'd eliminated the slowest thing. The debug build launched fast, the release build launched fast under instrumentation, and the release build launched really slowly from the Finder. What?? I was about to go up to bed when it occurred to me that I had switched on Rosetta for something months ago and might have just forgotten to turn it off. Yep. The application launches quickly now. Users might even notice when this gets to them in a couple months.


By the way, when you are using Apple's Core Foundation XML Parser, you really really want to use CFTreeGetFirstChild followed by a series of CFTreeGetNextSibling calls to traverse the elements. You really don't want to call CFTreeGetChildAtIndex on anything approaching a large XML document.
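
A minimal sketch of that traversal, which visits each child once instead of re-walking the sibling list on every CFTreeGetChildAtIndex call:

#include <CoreFoundation/CoreFoundation.h>

// Walk an XML tree produced by CFXMLTreeCreateFromData using the
// first-child/next-sibling calls, recursing into each node's children.
static void VisitChildren(CFXMLTreeRef parent)
{
    CFXMLTreeRef child = CFTreeGetFirstChild(parent);
    while (child != NULL)
    {
        CFXMLNodeRef node = CFXMLTreeGetNode(child);
        CFStringRef nodeString = CFXMLNodeGetString(node);
        if (nodeString != NULL)
            CFShow(nodeString);         // element name, text, etc.

        VisitChildren(child);           // grandchildren
        child = CFTreeGetNextSibling(child);
    }
}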

Monday, March 17, 2008

iPhone App Ideas: A Target Rich Environment

I was watching a recent episode of Anthony Bourdain's: No Reservations on the Travel Channel this morning, and came up with an idea for an application.

Bourdain described the process in which orders are conveyed through a restaurant: customer tells waiter, waiter writes on order slip, waiter goes to waiter station and inputs order into terminal, order gets directed to the appropriate kitchen personnel. I was struck by how the input terminal involved bringing up a diagram of the appropriate table, and inputting the order for each chair around the circle. This seemed something easily transferable to an iPod Touch style interface.

I'm imagining a situation where the customer tells the waiter, the waiter touches the table in a map of the restaurant, touches the seat, is given a series of choices corresponding to items on the menu, touches Complete, and the order is sent directly to the kitchen. If the customer orders shrimp and the kitchen just ran out of shrimp, the software would show an alert.

In this scenario, the waiter only has to enter the data once, there is less chance of transcription errors, the restaurant does not have to set aside valuable floor space for the waiter station, and status information is more current. Presumably, the people best positioned to make such software are whoever makes the current restaurant terminals, but some enterprising types could get into the business right now, while the going is good, by providing an entire Mac/iPod-based solution which incorporates the whole system: data entry, accounting, kitchen display screens, etc. If anyone uses this idea, cash is always an appropriate gift.

It would be nice if Apple were to come up with an industrial version of the iPod Touch for specialized situations like this, one that was not worth stealing as a media player.

Wednesday, March 12, 2008

The iPhone and the End of Bloatware

You still see tables filling full page ads in computer magazines, a column of checks for the product being advertised, a column of blanks under the competitors. This was the game of the big company, the Microsofts of the world; organizations with large enough staffs to code solutions to any need or perceived need of any customer big or small. Every customer needs 10% of Microsoft Office, but every customer needs a different 10%. Smaller, more nimble companies have made better pure word processors, but they have all been relegated to niche markets, as the vast majority of customers know that somewhere in Office are the exact features they need, plus a bunch of features they might need someday for free. It's just a matter of finding them. And in a world of full keyboards, contextual menus, 24" monitors, gigabytes of RAM, terabytes of storage, multi-cores, and fast CPUs, there's always somewhere to tuck a feature.

Now comes the iPhone/iPod Touch with its 480x320 screen, no keyboard shortcuts, finite memory and demand for near instantaneous launch and quit times. Ponder this small subset of toolbars for Word 2003 on Windows:


and imagine that each icon was twice as big to accommodate the stubby adult forefinger. Imagine making even this small subset of functionality accessible somehow while still showing content. I've a good imagination, but it fails me here.

Anti-bloat features of the iPhone OS:
  • Apps must launch quickly (in practice, anything more than a second will seem intolerable), and quit as fast or faster
  • Apps must be relaunched every time you move back from another app.
  • There is a practical limit of 10 or so toolbar type icons per view, and 5 is more reasonable. This leads to a geometric decrease in accessible functionality versus a program on a PC. If you can fit 40 toolbar icons on your computer's monitor, and each opens a dialog with 40 items, and each dialog item invokes a subdialog with 40 features you have 40x40x40-40-40=63,920 accessible features versus 5x5x5-5-5=115 features 3 levels deep on an iPhone (well you could probably fit more than 5 items per view, but you get the idea)
  • There are no contextual menus, much less multi-level contextual menu trees
  • There are no keyboard shortcuts, eliminating the need for laminated keyboard shortcut hint sheets Scotch taped to the back of the iPhone
  • There are gestures, but really, how many unique gestures can you expect a person to master: pinch, shake...?
  • There is a finite amount of RAM, and telling the user to "just add more" is not an option
  • The lack of both RAM and application interaction is a firewall against the metastasized bloatware that is the "Office Suite," whose components have to at least pretend to work together

No, what we will get are mini-apps. Apps a single competent programmer will bang out and polish in 3 months or less. The functionality which makes up Office could be broken up into scores of such mini-apps: a mini-table creator, a mini-slide presenter, a separate mini-slide editor, a database client, a little drawing app. In fact, even these small apps will probably be too big, and will be streamlined for more specific, task-oriented, specialized purposes: a blog post word processor, a series of charting modules which each do one type of chart, a specialized Keynote which only has the tools needed to create presentations following a single corporation's style—imagine all those exactly-the-same-style presentations you see at Apple events. That would be doable in an app which could only show 5 toolbar icons at once, and which relied upon gestures, animation, templates and transparency to maximize what the user can see and do. And, lots of apps to help with one's location-based social networking, whatever that is.

In the iPhone ecosystem, bloatware is non-adaptive. The big have no feature advantage over the small.

The experienced Cocoa programmer does have an advantage, however. I have been reading through the Apple example code, and it is made up of delightful exercises in object-oriented minimalism. Actual, full-blown apps written in this style will be incredibly lightweight for the functionality delivered, with zippy launch times and responsive interfaces. This seems to be how one masters a framework, especially Cocoa: by learning how to do the most with the least code, which is, again, the antithesis of the bloatware mentality, which seems to care only about churning out as much code as possible.

Friday, March 07, 2008

Re-imagining a Desktop Utility into Cocoa Touch

If anybody's starting up a business based around iPhone development, and need a chief engineer, I'm sure you can find my e-mail address.


Well, the momentous day came, and the iPhone SDK is in our hands. I wish I were not so incredibly busy at my day job, or I would take a couple weeks off and learn the ins and outs. But I've decided I should start small and port a little application I wrote, Remote Remote GH for MythTV, which I'm afraid I have not been properly maintaining, but which has the advantage that it would be much more useful on an iPhone than on a laptop, and which only uses Cocoa and various Core Foundation routines. I'm going to go through the exercise here of paring the interface down to size and making it more in touch with the Touch's way of doing things. Then sometime in the next couple months, I'll take the time to actually do it.


Let's look at the original and see how we can get it down to 320x460 and 480x300 (which are about the available sizes once you discount the "status bar" region of the display). Not only do you have a smallish screen, there is inconvenient text entry, and any widget on screen should be at least 44x44 pixels to deal with the stubby fingers of adult humans. On the plus side, there is multi-touch and 3D movement via the accelerometer.



  • Disconnect/Connect Button This should probably be done away with entirely; the app should try to keep a connection up whenever it is visible.
  • Show Keys Button This brings up a list of keyboard keys appropriate for the current state of the MythTV. The idea was for the user to control the MythTV with the same key presses used when sitting in front of the computer running the MythTV frontend. Since keyboard input on the iPhone is discouraged, replacing it with a series of contextually appropriate button panels seems right: a panel for watching recordings, a panel for navigating, a panel for DVDs, etc. (see the sketch after this list).
  • Scrolling Status Console This was an area devoted to telling the user what has happened. It can be condensed into a two-line status widget at the bottom of each screen.
  • Navigation Buttons Live TV, DVD, etc. can be put into a standard navigation bar, to let the user jump to a common task; I will pare these down to Live TV, Recordings and Video.
  • Recorded Programs Table This can be moved to its own panel and condensed, with perhaps a sub-panel the user can navigate to for more complete information on each recorded program.
  • Big Scrub Bar I like the idea of a very long scrub bar to allow fine control over where in a program the user jumps. This is a good candidate for a control which always runs along the long axis regardless of the device's orientation, and it will only be visible in playback mode panels.
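
To make this concrete, here is a minimal sketch of one such contextual button panel, written as a method that would live on my (hypothetical) playback view controller. The -panelButtonPressed: action is a made-up hook that would translate the tapped title into the matching MythTV key press; the titles and layout are placeholders.

#import <UIKit/UIKit.h>

// One contextual panel for playback mode: a row of finger-sized buttons.
// (Sketch of a method on the playback view controller.)
- (UIView *)playbackPanelWithFrame:(CGRect)frame
{
    UIView *panel = [[[UIView alloc] initWithFrame:frame] autorelease];
    NSArray *titles = [NSArray arrayWithObjects:@"Pause", @"Skip", @"Back", @"Info", nil];
    CGFloat x = 0.0;
    for (NSString *title in titles) {
        UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
        button.frame = CGRectMake(x, 0.0, 72.0, 44.0);   // at least 44 pixels tall for fingers
        [button setTitle:title forState:UIControlStateNormal];
        [button addTarget:self action:@selector(panelButtonPressed:)
         forControlEvents:UIControlEventTouchUpInside];
        [panel addSubview:button];
        x += 78.0;
    }
    return panel;
}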


As I said, I have very little time on hand, but I was able to take a first step of compiling my network protocol classes, which compiled after a mere 5 code changes—removing #include "Cocoa/Cocoa.h", and replacing calls to Carbon's NewPtr(...)/DisposePtr(...) with malloc(...)/free(...).
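
For flavor, here's roughly what one of those mechanical changes looks like; CopyReplyBuffer is a made-up name, not something from the actual classes:

#include <stdlib.h>
#include <string.h>

/* Illustrative only: a buffer copy the way the ported code now does it.
   The Carbon version called NewPtr(length + 1) and later DisposePtr(buffer). */
static char *CopyReplyBuffer(const char *bytes, size_t length)
{
    char *buffer = malloc(length + 1);   /* was NewPtr(length + 1) */
    if (buffer != NULL) {
        memcpy(buffer, bytes, length);
        buffer[length] = '\0';
    }
    return buffer;                       /* caller calls free(), formerly DisposePtr() */
}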

Then it was just a copy/paste job to make a stripped-down version of my original application delegate, which just stuffed replies from the server into a simple UILabel in my main window. Amazingly, this worked on the second try, with a connection forming to the server, followed by periodic requests for status. If anything, the networking code is more solid than in the desktop application.
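
The whole delegate amounts to little more than this sketch; -connectionDidReceiveStatus: stands in for whatever callback my network classes actually deliver:

#import <UIKit/UIKit.h>

// Sketch of the throwaway delegate: one window, one label, replies go into the label.
@interface StatusAppDelegate : NSObject
{
    UIWindow *window;
    UILabel  *statusLabel;
}
- (void)connectionDidReceiveStatus:(NSString *)reply;  // placeholder callback name
@end

@implementation StatusAppDelegate

- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    statusLabel = [[UILabel alloc] initWithFrame:CGRectInset([window bounds], 10.0, 100.0)];
    statusLabel.numberOfLines = 0;          // let multi-line replies wrap
    [window addSubview:statusLabel];
    [window makeKeyAndVisible];
    // ... open the MythTV connection and schedule the periodic status requests ...
}

- (void)connectionDidReceiveStatus:(NSString *)reply
{
    statusLabel.text = reply;               // stuff the server's reply into the label
}

@end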

I'll update this later, as I start to build the GUI, and figure out how best to minimize my power usage (keeping the Wi-Fi unit off for extended periods).

Monday, March 03, 2008

On the Badness of Skins

I refer you to this posting on the product forum for the CoreAVC player application, an application which claims to have extremely efficient H.264 decoding.

Here is the company representative explaining why the Edit menu always comes first, instead of the File menu as Jobs intended:

CoreUI, our skin layer, is not meant to hardcode things like "File". There are some apps that don't deal with any file and still require text editing. And we can't wild guess where they want the Edit menu. The thing we could do is add a special element in the skin that could place the Edit menu where you want it to be...


Now, I would very much like to see better efficiency in H.264 playback, especially on the 1.6 GHz Core Duo Mac Mini on my wife's desk. That would be a good thing. But here is a cross-platform application whose creators can't be bothered to put together a simple Cocoa shell application for their player; believe me, this would not be a lot of work for a semi-competent Mac guy to throw together. Instead, they've ported their skinning engine—which I'm sure they are quite proud of. The engine is extremely flexible, except in the sense that you can't put together a skin which doesn't make their programmers look clueless.

Let's see. How much time to put together a nice Cocoa application to host your player: X. How much time to port and debug a "flexible" skinning engine: maybe 4X. So precious development time which could be spent elsewhere is instead spent on skinning, a misfeature given to the world by WinAmp and repeated by seemingly every other media player intent on letting the skinless iTunes eat their lunch.


You want to know a big reason WinAmp, or MusicMatch (which I worked for), or whoever, couldn't keep up with Apple's iTunes team? Skins. If you are spending 30% of your development time maintaining skins and the skin engine, if you start rejecting new features because they don't fit into the design of the engine or because you would have to add new artwork to 10 separate skins, if you have your best minds trying to figure out an abstract design for docking arbitrary subwindows, and if your testing staff spends all day looking at every permutation of new feature and skin, then it's going to be like you are fighting Apple with your arm tied to your leg, and your leg bolted to the floor.

And for what? A feature that the vast majority of your user base is at most going to flip through once and forget exists.

BTW, CorePlayer, at the moment, is virtually useless on the Mac, as it lacks AC3 passthrough. You know, an actual feature involving playing media. Maybe, if they hadn't spent all that time on their skinning engine, they'd have a product Mac users like me would like to buy.

Monday, February 25, 2008

We Need a PDF Replacement for PicComment

I direct your attention to the very explicitly obsoleted documentation for the QuickDraw routine PicComment. With a call to PicComment, an application would insert whatever arbitrary data it liked into a PICT. It could then put the PICT on the system clipboard. The user could paste it into their favorite word processor. Days, weeks, femtoseconds later, the user could copy the same image back into the clipboard, and paste it back into the original application.


Without a PicComment, there was little the originating application could do with a pasted-back image: just treat it as an uneditable graphic, floating lifelessly inside a document. With a PicComment, the application could retrieve enough information to recreate the original selection from which the image came, perfectly, without loss of quality. Users built up ad hoc workflows around the knowledge that they didn't have to save original image documents, but could copy and paste images back from their word processor if they wished to revise.
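
For those who never used it, tagging a picture looked roughly like this. I'm writing from memory, and the comment kind and payload here are placeholders:

// Classic Mac OS sketch: record a PICT and squirrel private data into it.
// As I recall, kind 100 was the value set aside for application-specific comments.
#include <Carbon/Carbon.h>

PicHandle RecordTaggedPicture(const Rect *pictureFrame, Handle privateData)
{
    PicHandle picture = OpenPicture(pictureFrame);   // start recording QuickDraw calls
    PicComment(100, (short)GetHandleSize(privateData), privateData);
    /* ... draw the selection into the picture ... */
    ClosePicture();
    return picture;
}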


With the gradual withering away of PICT support in OS X applications, it's well past time to look at getting round-trip clipboard support back. I hope Apple is thinking about this. The obvious solution is to:
  • Extend the keys available to CGPDFContextCreate and other similar PDF-creating APIs to include private vendor data. Presumably, you could provide a dictionary under a kCGPDFContextVendorPrivateData key, with a kCGPDFContextVendorID (like "com.genhelp.mygreatapp") and an XML fragment—or other convenient format—needed to recreate the original (see the sketch after this list).
  • Encourage application developers to preserve original PDF data whether received via file insertion, pasting or dragging. This would include Apple's own applications such as Preview and TextEdit. Often, applications will thoughtlessly re-render an image into PDF even when the source was PDF to begin with, heedlessly throwing away the original's auxiliary data.
  • Add similar functionality to the TIFF flavor favored by image editing software.
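
To be clear, the kCGPDFContextVendor... keys above are my invention and do not exist in CoreGraphics today, but CGPDFContextCreate already accepts an auxiliary dictionary, so the proposal would slot in along these lines (the "RecreationData" key is also a placeholder):

#include <ApplicationServices/ApplicationServices.h>

// Hypothetical keys: these do NOT exist in CoreGraphics; they are the proposal.
#define kCGPDFContextVendorPrivateData  CFSTR("VendorPrivateData")
#define kCGPDFContextVendorID           CFSTR("VendorID")

CGContextRef CreateTaggedPDFContext(CGDataConsumerRef consumer, CGRect mediaBox,
                                    CFPropertyListRef recreationInfo)
{
    // The vendor dictionary: who made this, and what they need to recreate it.
    const void *vendorKeys[]   = { kCGPDFContextVendorID, CFSTR("RecreationData") };
    const void *vendorValues[] = { CFSTR("com.genhelp.mygreatapp"), recreationInfo };
    CFDictionaryRef vendorDict = CFDictionaryCreate(NULL, vendorKeys, vendorValues, 2,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    const void *auxKeys[]   = { kCGPDFContextVendorPrivateData };
    const void *auxValues[] = { vendorDict };
    CFDictionaryRef auxiliaryInfo = CFDictionaryCreate(NULL, auxKeys, auxValues, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    CGContextRef pdfContext = CGPDFContextCreate(consumer, &mediaBox, auxiliaryInfo);
    CFRelease(auxiliaryInfo);
    CFRelease(vendorDict);
    return pdfContext;
}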


Presumably, the host application wouldn't even need to save the whole original, just the contents of the vendor dictionary.

And users can leave the world where a copy/paste is a one-way trip.

Thursday, January 24, 2008

Microsoft Office Supports PDF on the Clipboard, And Why That is a Big Deal

For months, I've wanted to know whether Microsoft Office 2008 supported copying and pasting PDF data from the OS X clipboard. I couldn't find out, and it's not like I didn't ask. Two days ago, Office 2008 appeared on my chair, and the answer is yes.

Backing up a moment: when you copy content in one application and paste it in another, you are using a system service to transfer the data, be it via the old-style Carbon clipboard manager, the Cocoa NSPasteboard class, or the newish Pasteboard framework available to the non-Objective-C crowd. The two applications must agree on the format of the data exchanged, so typically only widespread standards are used. For text, as I've outlined before, the RTF format is preferred. For bitmap images, a good choice is lossless TIFF. Vectored images, however, were a quandary.

Vectored images are pictures composed of individual drawing operations such as MoveTo, LineTo, AddToPath, FillPath, etc. Because they are not limited in resolution like a bitmap, they look good on screen and tend to print with lovely crisp lines. They also tend to be smaller than bitmap files. Every application on Classic Mac OS used a convenient format called PICT, which is basically a recording of the QuickDraw operations needed to generate the onscreen display. PICT is a primitive format, something more in tune with the computers of 1984 than 2008. Off the top of my head: it lacks fractional coordinates, paths, Bezier curves, gradient fills and pagination; it is limited to QuickDraw fonts; its coordinate matrix manipulations are limited to rotating text; it has poor support in Cocoa applications; and it's ugly. The only two good things I can say about it are that it allows for high quality printing via embedded PostScript, and that you can squirrel away your own data in it in case the same PICT gets copied and pasted back into your application.

When OS X arrived, legacy Carbon applications kept on generating their PICT clipboards for both bitmap and vectored material even though the superior PDF format was available and universally used by newer Cocoa applications. QuickDraw became obsolete and onscreen drawing is most often done with Quartz calls, and yet applications still maintain ways to generate PICT clipboards at great expense of maintenance and design. Why?

Because Microsoft Office didn't support PDF, and if you want to sell business applications on the Mac, you have to share data with Office, and that content had better print nicely from within Office. I know from personal experience the aggravation of maintaining the portion of an application which renders content into QuickDraw PICTs (ugly, kludgy QuickDraw PICTs) when I could be easily generating PDF clipboards: beautiful, lightweight, lithe PDF files.
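
For reference, here is roughly all it takes for a Cocoa application to put PDF on the clipboard; CopyViewAsPDF is just an illustrative name:

#import <Cocoa/Cocoa.h>

// Put a PDF rendering of a view on the general pasteboard.
void CopyViewAsPDF(NSView *view)
{
    NSData *pdfData = [view dataWithPDFInsideRect:[view bounds]];
    NSPasteboard *pasteboard = [NSPasteboard generalPasteboard];
    [pasteboard declareTypes:[NSArray arrayWithObject:NSPDFPboardType] owner:nil];
    [pasteboard setData:pdfData forType:NSPDFPboardType];
}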

To illustrate what I mean, I created this PDF in an application which supports creating EPS files but not putting PDF on the clipboard. I opened it in the OS X Preview application (a Cocoa app):

Copied the image into TextEdit (another Cocoa App) from Preview:

TextEdit has a bug where it doesn't re-render embedded PDFs when it zooms.
Copied the image from TextEdit and pasted it back into Preview and zoomed in on a detail:

Now compare with a zoomed detail of the PICT version, taken from the clipboard of the original application (a Carbon app) and pasted into Preview (ignore the checkerboard):


Presumably, OS X could provide a service where it would extract embedded PostScript from PICTs (if available) and generate a pleasing PDF pasteboard, but it doesn't, and I doubt that Apple wants to encourage developers to keep on using PICT.


Getting back to the big news, there was Office 2008 on my chair. Install. Draw a moon:

Copy. Launch Pasteboard Peeker and see this output (... means omitted content):
PasteboardRef: 1116096 ItemCount: 1
Index: 1 item ID: 1112493904
...
"com.adobe.pdf"
"Apple PDF pasteboard type"
'PDF ' P_____ 21056 PDF-1.3 4 0 obj << /Length 5 0 R /Filter /FlateDecode >> stream x

"com.apple.pict"
"Apple PICT pasteboard type"
'PICT' P_____ 409198 >n C 0 H
...


Yay.

And notice how svelte the PDF (21,056 bytes) is compared to the PICT (409,198 bytes). Rendering a gradient fill in QuickDraw is not pretty.
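
I don't have Pasteboard Peeker's source in front of me, but a minimal clipboard inspection with the Pasteboard framework looks something like this:

#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

// Walk the clipboard and report the size of any "com.adobe.pdf" flavor found.
static void ReportPDFFlavors(void)
{
    PasteboardRef pasteboard;
    ItemCount itemCount = 0;
    ItemCount i;

    if (PasteboardCreate(kPasteboardClipboard, &pasteboard) != noErr)
        return;
    PasteboardSynchronize(pasteboard);
    PasteboardGetItemCount(pasteboard, &itemCount);
    for (i = 1; i <= itemCount; i++) {              // pasteboard items are 1-indexed
        PasteboardItemID itemID;
        CFDataRef pdfData = NULL;
        PasteboardGetItemIdentifier(pasteboard, i, &itemID);
        if (PasteboardCopyItemFlavorData(pasteboard, itemID,
                CFSTR("com.adobe.pdf"), &pdfData) == noErr) {
            printf("com.adobe.pdf flavor: %ld bytes\n", (long)CFDataGetLength(pdfData));
            CFRelease(pdfData);
        }
    }
    CFRelease(pasteboard);
}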

Go back to Word and add a star:

Copy and paste into Pasteboard Peeker and:
PasteboardRef: 1116096 ItemCount: 1
Index: 1 item ID: 1112493904
...
"com.adobe.pdf"
"Apple PDF pasteboard type"
'PDF ' P_____ 24935 PDF-1.3 4 0 obj << /Length 5 0 R /Filter /FlateDecode >> stream x

"com.apple.pict"
"Apple PICT pasteboard type"
'PICT' P_____ 498658 H
...
The PICT version bloats by about 89K while the PDF grows by a mere 4K. Not that size matters much any more, with RAM and hard drive prices the way they are.


Word in Office 2004 had a clipboard which looked like this:
PasteboardRef: 3323920 ItemCount: 1
Index: 1 item ID: 1112493904
...
"com.apple.pict"
"Apple PICT pasteboard type"
'PICT' P_____ 2222 U , , f
...


Open up the pasted PDF in Preview and Zoom:

Look at that beautiful shadow detail! Try to do that in a PICT!

Draw something in Sketch, copy, paste, and yep, there it is: a PDF pasted into Word. Yay again.

But what does this all mean? It means that once Office 2008 sees widespread adoption, the rest of the Mac content creation software industry can rip out every last QuickDraw call in their application. It means we can build 64-bit versions of our applications. It means we had best start putting PDF on our own clipboards. It means Cocoa apps can generate content and with no extra effort have it look great inside Office apps. It means there will be a new, higher minimum quality for interchanged content. It means we can forget everything we ever knew about Classic Mac programming.

The future is finally here.
[Update: One fly in the ointment is that Word has a bug wherein, if you paste a PDF graphic into a Word document and subsequently copy it from Word and paste it into another application (such as Preview), it loses its vectored quality:


But while the graphic is still within Word, it scales, prints, and zooms beautifully, so presumably this is just a bug in the copy code and not a design flaw: the vectored PDF is evidently being maintained internally in some vectored form.]

Tuesday, January 22, 2008

Space Heaters: Honeywell HZ-519 versus DeLonghi HHP 1500

I have an online account with Consumer Reports, which I've been quite pleased with; it's the first place I look for home appliance recommendations. The manufacturers of my washer, dryer, wet vac, space heaters, and lawn mower can all thank Consumer Reports for my purchase. However, sometimes, by sticking to objective measures, they get things wrong when problems crop up outside those measures. In my case, I feel their high recommendation of the Honeywell HZ-519 electric convection space heater was misguided compared to another heater they gave a not-quite-as-high rating, the DeLonghi HHP 1500.



Problem 1: Physical Dimensions

You might notice that the HZ-519 is long and thin. Long and thin objects are delicate and require odd shipping boxes. As I, and at least one other Amazon reviewer, found out, these things are easily bent in the middle during shipping, as they are hard to protect; mine arrived with some ugly and noisy bends in the metal. (The Amazon merchant did give me $24 back instead of taking the return.) Also, while it has ample cord length, the cord is mounted at the right side; if your outlet happens to be to the left, the cord suddenly is no longer so long. Compare this with the squarish, solid HHP 1500, which fits in a box of normal dimensions, and whose cord can reach a fair distance to the left or right.

Problem 2: Temperature Control

Consumer Reports gives the HZ-519 high marks for its digital temperature control and timer, which are nice for a single use: set the temperature to maintain by pushing a few buttons, set the time to heat with a few more button presses. The HHP 1500, on the other hand, has no way to set a specific temperature; you turn some knobs and it will try to maintain a temperature, but what temperature that is it doesn't say. You will have to futz with it over the course of a couple nights until you find a setting which is comfortable for you. But, and here is the big but, once you find a setting you like, you are done. When you want to use the DeLonghi, you come in and flip the top knob on; with the Honeywell, you press the power button, press the "Temp/Timer" button to select temperature, and hit a couple arrow keys; if you want the unit to turn off automatically, that's more button presses. Every single session. There is no memory of the last setting. Yes, you do get a timer, which is nice, but try setting the temperature on this thing in a darkened room when all you want to do is crawl into bed. (There are versions of this heater with a display backlight and a remote control, which would mitigate this annoyance.)





Problem 3: Being Nice to Your Fuses

The HHP 1500 has two main power settings. One draws about 7 Amps when powering its element, the other around 13 Amps, which spikes to over 15 when turning on, causing my Kill-a-Watt to make a warning beep. The HZ-519 has one mode which draws around 11 Amps. A typical household circuit is rated at 15 Amps. Now imagine having two bedrooms which share an electrical circuit, each with its own space heater. If you install two HHP 1500s, that's fine: just keep both of them at their 7 Amp setting (14 Amps total), and it will simply take longer to heat the rooms. If you have two HZ-519s, that's around 22 Amps, and pop goes the circuit breaker. This assumes the rooms in question are small enough to be heated adequately at the lower setting.


Problem 4: Noise

OK, I shouldn't criticize the HZ-519 too harshly here, since mine has a lot of twisted metal from the shipping incident, but it is quite noisy as it expands and contracts. The HHP 1500 is dead quiet.

Problem 5: Flexibility of Placement

The HZ-519 is designed to be placed along a wall, while the HHP 1500 can be mounted on a wall or rolled into the middle of a room.

Energy Usage

Not a big problem here, and impossible for me to compare. The HZ-519 in one bedroom has been using about 6 kWh (about a dollar) a night keeping the room at a comfy temperature above the chilly one I keep the rest of the house at, while the HHP 1500 has been draining about the same. I would be happy with nice thick blankets, but my wife wants the children warm when they kick off their covers.

Safety

The HZ-519 does have such additional safety features as automatic switch-off when knocked over. You will have to judge how important this is for you.

In Conclusion

If you are in the market for an electric convection space heater, get the DeLonghi. Buy it in the summer when it is cheap, because these heaters have really gone up in price since the cold weather set in. I paid $80 for a DeLonghi in late November; it's now mid-January and the same heater is $160.
[Updated with additional commentary after first posting]

Tuesday, January 01, 2008

Memories of RAM past

Pretending 1GB equals 1000 MB.


This is in the category of things that happen to everyone, but which should be remarked upon: modern RAM capacities are amazing.


My first computer, a Mac Plus, purchased new from the University of North Dakota's bookstore, had .001 GB of RAM installed, which I upgraded first to .0025 GB, and then to its maximum of .004 GB, at a cost, as I recall, of about $150,000/GB.


I just purchased, at $15/GB (shipping included), a 2GB module for my MacBook, bringing it up to 3GB—and requiring me to dispose of an inconvenient 1GB module. So my computer of today has 3000× the RAM of my 1988 computer, while the price per gigabyte has dropped to 1/10,000th of what it once was. Imagine if cars were improving at the same rate.


Here's a chart of the final RAM capacities for all my previous computers. I've long meant to make up this chart, so bear with the irrelevance to your life:


Drive capacities have gone through a similar transition, from the .0008 GB floppy in my Plus to the 200 GB drive in my MacBook.


The scary thing is the idea that, over the next 20 years, RAM might increase in typical capacity by another 3000 times. What will we be doing with all of it?

[Update: For whatever reason, my MacBook does not like the Transcend TS256MSQ64V6U module in combination with any other module I have. It works by itself, but pair it with either of the pre-existing 1GB modules (or even a 256MB module from a Mac Mini), and the machine will not boot. So, I'm stuck at a mere 2GB.]