A blog about iPhones, iPads, Swift, being a working coder, gadgets and tech. The official blog of Generally Helpful Software.
Wednesday, October 29, 2008
Debugging a Shower
The big extravagance at the house this year is replacing the cheapest-looking vinyl shower you ever saw with a very nice tile and glass block version. A friend of my wife's built it over the course of a month; it's beautiful. The babies have been learning the joys of showers over baths, and I've been spending much of my time pondering development problems from the built-in granite seat.
Unfortunately, it's started leaking, and in tracking down the leak, I've realized how broadly the skills we pick up debugging software apply to the physical world.
Saturday, October 25, 2008
Visual Studio 2005 C++ and the ? : operator
One thing you want in a C++ compiler is consistency with other C++ compilers, especially if you are doing cross-platform development work. I want the VS C++ compiler to interpret what I tell it to do exactly the same way that gcc interprets it. That way, I don't have to special-case my compilations, and I don't create a bunch of platform-specific bugs in my platform-neutral code. And usually, the two compilers do agree.
However, I discovered today (Friday) that if you compile something like:
double x = (false) ? 0 : 1.4;
that x will not equal 1.4, as most people (and the gcc compiler) would think, but rather 1.0. Why? Because the compiler sees the 0, interprets it as an integer, and decides that since both halves of the ?: must have the same type, that type will be integer. The fact that this sits in the middle of an assignment to a double means nothing.
In order for x to equal the expected 1.4, you have to write:
double x = (false) ? 0.0 : 1.4;
I'm not even saying that Visual Studio is wrong. It's different from gcc, which leads to platform-specific bugs, and it's unexpected, so the unaware coder will type the unwanted form without knowing anything could go wrong.
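If you want to double-check which interpretation your compilers take, a minimal test case along these lines (my own sketch, not from the original post; the variable names are invented) compiles as either C or C++. The usual arithmetic conversions apply to the conditional operator, so both lines should print 1.4 on a conforming compiler:
#include <stdio.h>
int main(void)
{
    double ambiguous = 0 ? 0 : 1.4;    /* integer literal in one branch */
    double explicit_ = 0 ? 0.0 : 1.4;  /* double literal in both branches */
    printf("ambiguous = %f\n", ambiguous);  /* reportedly prints 1.000000 under VS 2005 */
    printf("explicit  = %f\n", explicit_);
    return 0;
}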
[Update: Russell Finn goes into more language lawyer parsing of this, and believes VS C++ is wrong. (I don't know Mr. Finn, I just like to read my Site Meter page to see how people get to my blog.)]
[Update 2: And don't let me get started about how VS C++ allows you to put the closing angle brackets of a nested template declaration without a space, as in:
vector<pair<string, int>> myVectorOfPairs;
]
Thursday, October 23, 2008
News of the Not News: People Like Free iPhone Apps
This is one of those blog posts where the author has two data points and has to write an entire entry about those points. Here is the data: my iPhone app to help HDHomerun users align their antennas, Signal GH, went 5 days without a sale. This despite getting a favorable mention on a well-known podcast on Friday. So I temporarily started giving it away to boost exposure and reviews. A day later, I had 205 downloads from the U.S.
So $2.99 equals 0 unit sales per day == $0.00 profit
And $0.00 equals 205 unit sales per day == $0.00 profit
At least I'm breaking even.
I believe a large number of these downloads are pointless, as you have to have an HDHomerun to do anything at all with Signal GH. There are people who troll the back pages of the utilities section of the App Store, downloading everything free. I just hope these people don't write reviews.
Thursday, October 16, 2008
Thankful to be a Cocoa Programmer
The company I work for is interested in Microsoft's .NET framework; I am not. But, I was scrounging for free content on iTunes the other day and came across the .NET Rocks podcast, and out of idle curiosity downloaded a few episodes. They are well done, and put together by friendly people with an unflagging enthusiasm for Microsoft technologies. And they make me very happy to spend much—unfortunately not all—of my time working on Macs and iPhones.
Apparently, the target audience is composed of developers putting together custom business applications; the kind of vertical apps corporate America (and apparently Dubai) consumes by the megaline of code. Not the general-purpose, highly finished, large-user-base applications I've spent much of my career writing, but database frontends and the like. It's the Bizarro world out there, where right-justifying labels is considered a major advance in GUI development.
Any iPhone developer would be well served by listening to the recent cast on Windows Mobile. As someone who was scrounging for work when the Mac had 3% market share, I can sympathize with the pathos of a product manager for Windows Mobile trying to put a brave face on disaster, but come on. This exchange pretty much sums up the level of wishfulness and straw-grasping:
Host: So let me ask the question a different way: the next version, whatever it's called; should Apple be scared?
Rob Tiffany: Very scared.
Hosts: (laughs) YES!
Rob Tiffany: Very scared.
Hosts: I knew it.
Rob Tiffany: Yeah, yeah, we're working on some secret sauce out there.
Host: Not too secret anymore!
I will congratulate the hosts for not being blind to WinCE's current flaws: they gave Rob a hard time about how the phone app on their phones was glacially slow; I'm just amazed they think it will get better. That old saying about a second marriage being the triumph of hope over experience...
And the episode on complying with the corporate governance rules of the Sarbanes-Oxley law... If I had to do that sort of thing, I'd seriously consider going to work at Home Depot. How does one show up at work every day to do that sort of thing?
And that's the thing. I'm nearly always happy either going into work—assuming I won't be spending the whole day fixing OLE bugs on the PC—or pulling out my MacBook and adding a refinement to an iPhone app. Life is sweet. I get to work in an application framework which was designed right from the start: lightweight and powerfully elegant. I'm not one of a hundred cogs living in a condo in Dubai; I'm a sole practitioner, or an unblended part of a small team. I write software people don't have to be paid to use. I don't have to wait for the secret sauce which never comes. I am a Cocoa programmer, and for that I am thankful.
Saturday, October 11, 2008
The Loss of Homogeneity: iPhones on the Network
Developing for the iPhone and iPod Touch is nice: limited hardware variation leads to reliability. If it works on my iPhone running the 2.1 OS, it should run well on your iPhone. They are pretty much the same. Homogeneity is good when it comes to testing for program errors.
So your iPhone is the same as mine, but your network is not.
I submitted a new version of Remote Remote GH for OS X Touch to Apple a week ago. It had been gradually gestating and stabilizing as I had time to improve it. For the last several weeks, it had been quite stable, and it had never been much of a crasher to begin with. Thursday night, Apple notified me that the new version was available for distribution on the App Store; Friday morning RRgh crashed on me at launch. Wha?
Backing up. I had been playing with the XBox Media Center (XBMC) for the AppleTV, and last night I had tried turning on its UPnP (Universal Plug and Play) server and client. The new RRgh's big feature is support for auto-detecting MythTV frontends via UPnP, so having another UPnP device on my network meant my code had to check that it wasn't a MythTV. Unfortunately, this revealed a latent crashing bug where I was expecting a string, got a NULL pointer instead, and boom, boom, crash. (No actual audio percussion, just a silent quit.)
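In hindsight, the guard is trivial. A hypothetical sketch (the function and the matching rule here are invented, not the actual RRgh code) shows the shape of the fix: any string pulled out of a UPnP device description may simply be absent.
#include <string.h>
#include <stdbool.h>

/* Treat a missing friendly name as "not a MythTV" instead of dereferencing NULL. */
static bool device_is_mythtv(const char *friendlyName)
{
    if (friendlyName == NULL)   /* some UPnP servers omit optional fields */
        return false;
    return strstr(friendlyName, "MythTV") != NULL;  /* illustrative match only */
}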
An early updater experiencing (what I assume is) the same problem figured out my e-mail address and let me know that I wasn't the only one.
So, I had to quickly remove RRgh from distribution pending a 1.1.1 release, as I have no wish to get a deluge of negative reviews from whatever small fraction of the populace would be affected by this. Which means, there is no RRgh currently available for download, as I don't have the option to revert to the 1.0 binary.
If you are one of the people who downloaded 1.1, I hope it isn't crashing on you, but I've submitted an updated binary which should be out within the week.
Labels:
bugs,
iPhone Development
Thursday, October 02, 2008
News of the Not News: QuickDraw Printing is Dead
As I occasionally let on, my day job is the care and feeding of an ancient, semi-well-known Carbon application (and worse, its Win32 doppelganger, and worse still, its Win32 doppelganger's OLE interface). One "perk" is the occasional confrontation with the living dead. The dead in question is how we print. We print using a very old and elaborate system of QuickDraw calls spiked with embedded PostScript. This creates very nice output. On PostScript printers. From PowerPC Macs.
For Intel Macs, not so good. Why? Because there is no embedded PostScript in QuickDraw printing support for Intel Macs. It didn't make the cut of creaky technologies Apple bothered porting over during the Intel transition. So, yes, text and polygons look OK, if imprecisely positioned, and that's most of our output; but anything spline-based is bad, as there are no splines in QuickDraw, only very big polygons.
So, users of ancient Carbon applications: check out the printing on your shiny new Intel box.
Tuesday, September 23, 2008
How I Didn't Get Rich Via my iPhone App
I think we've all seen the stories about some guy in New Zealand making $6K a day for a Choplifter clone, or another fellow making $250K in two months reimagining Tetris. What you don't hear about are the flops. And I have a flop on my hands. My application to monitor over-the-air digital television signal quality, Signal GH, has sold 10 copies. I had no illusions that it would be a big seller; the target market is small: the intersection of iPhone owners and HDHomerun users. I just assumed there were a few thousand such people and they would all want my app.
Apparently not.
Unlike a lot of apps getting shoveled onto the App Store, Signal is competently coded: it performs a useful service, launches quickly, doesn't leak, doesn't crash, and has only a smattering of bugs. Maybe prospective buyers would like to know this, but I have had no reviews so far.
Part of the problem was that Apple mistakenly lists its release date as the day I submitted it to them for testing, instead of the day they actually released it to the public, so I got no time on the front page of the Utilities section. And part is probably too high a price point; maybe $5 is just too much, even for niche-market apps. Again, I'm flying in the dark here because I've gotten zero feedback.
I also don't know where HDHomerun users hang out. The Silicon Dust forums seem fairly low-volume. Maybe if the HDHomerun were buggier, people would be filling their forums, but dang it if the HDHomerun isn't the most reliable gadget on my network. And I don't want to make commercial announcements on public forums anyway; nobody likes that sort of thing. I'm not a marketing wiz; I am a fairly good Mac coder.
Oh well, I only wasted a couple weeks of spare coding time. I've learned my lessons about targeting larger markets. I learned a thing or two about iPhone development, and hopefully my next post on the subject will be "How I Did Get Rich Via My iPhone App."
UPnP with Cocoa
Needing a license-compatible implementation of UPnP I could use to locate MythTV frontends, I made the mistake of using the C++ version of CyberGarage's library. It was a mistake because the C++ version (1.7) is old and filled with bugs and leaks. The proper version to use is the C version (2.2), which is much less buggy and even comes with an Objective-C wrapper class.
If you are writing an iPhone app and need to locate a UPnP device, this is probably the way to go.
Oh, and one thing to consider: if you look through the code, you will see that the library avoids opening connections on network interfaces that have the IFF_LOOPBACK flag set—for obvious reasons. You might also want to avoid network interfaces with the IFF_POINTOPOINT flag set, as that is as like as not the cell radio, and you probably don't want to make UPnP inquiries over the cell network.
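A minimal sketch of that kind of interface filtering, using the standard getifaddrs() call (my own illustration, not CyberGarage code; treating the point-to-point interface as the cell radio is an assumption):
#include <sys/types.h>
#include <ifaddrs.h>
#include <net/if.h>
#include <stdio.h>

static void list_upnp_candidate_interfaces(void)
{
    struct ifaddrs *interfaces = NULL;
    if (getifaddrs(&interfaces) != 0)
        return;
    /* Note: an interface may be listed once per address family. */
    for (struct ifaddrs *ifa = interfaces; ifa != NULL; ifa = ifa->ifa_next)
    {
        if (ifa->ifa_flags & IFF_LOOPBACK)    /* skip lo0 */
            continue;
        if (ifa->ifa_flags & IFF_POINTOPOINT) /* skip what is likely the cell radio */
            continue;
        printf("candidate interface: %s\n", ifa->ifa_name);
    }
    freeifaddrs(interfaces);
}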
[Update: Looks like TCM Port Mapper might be the preferred library, but I haven't tried it.]
Labels:
iPhone Development,
UPNP
Wednesday, September 10, 2008
iTunes 8 is still not Cocoa
Earlier in the year, I posted a blog entry about how Apple, obviously, must be getting ready to release a new version of iTunes based upon the Cocoa framework, and not the Carbon framework. While I still believe this to be true, iTunes 8 is not that product.
A quick look through the application's bundle shows all the signs of a product still deeply welded to Carbon.
A quick glance through the Info.plist:
- No Principal Class, so no NSApplication at launch
- The HIWindowFlushAtFullRefreshRate flag is set indicating at least a few Carbon windows
- Application requires Carbon environment flag is set
There are a few .nib files in the bundle, but all the ones I looked at were Carbon.
Localized strings are still kept in a resource (.rsrc) file! What in the world.
In fact, there are no obvious signs of any Cocoa use at all.
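If you want to make the same check programmatically, here is a rough sketch (mine, and hypothetical; it assumes the standard NSPrincipalClass and LSRequiresCarbon Info.plist keys are the entries described in the list above):
#include <CoreFoundation/CoreFoundation.h>
#include <stdbool.h>
#include <stdio.h>

static void report_bundle_flavor(const char *appPath)
{
    CFStringRef path = CFStringCreateWithCString(kCFAllocatorDefault, appPath,
                                                 kCFStringEncodingUTF8);
    CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, path,
                                                 kCFURLPOSIXPathStyle, true);
    CFBundleRef bundle = CFBundleCreate(kCFAllocatorDefault, url);
    if (bundle != NULL)
    {
        /* A Cocoa app normally declares a principal class; a Carbon app
           typically sets the "requires Carbon environment" flag instead. */
        CFTypeRef principal = CFBundleGetValueForInfoDictionaryKey(bundle, CFSTR("NSPrincipalClass"));
        CFTypeRef carbon = CFBundleGetValueForInfoDictionaryKey(bundle, CFSTR("LSRequiresCarbon"));
        printf("NSPrincipalClass present: %s\n", principal ? "yes" : "no");
        printf("LSRequiresCarbon present: %s\n", carbon ? "yes" : "no");
        CFRelease(bundle);
    }
    if (url) CFRelease(url);
    if (path) CFRelease(path);
}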
As someone whose day job involves maintaining a Carbon app that "should" be transitioned to Cocoa, I can sympathize. But it is concerning to see Apple waste development time adding features like Genius to a straight Carbon application. Anybody writing new features will know that their work will have a shelf life of about 9 months and will have to be reimplemented—at least the GUI parts—in Cocoa. And the sooner a Cocoa iTunes comes out, the sooner it will become a mature, bug-free product.
Thursday, September 04, 2008
Revamping a Dated Cocoa Application: InCDius 2.5
iPhone development has re-energized my love of programming. My day job requires me to do too much Win32 coding, and that is soul-deadening for a Cocoa programmer. The combination of corporate priorities, and the fact that I can add features much more quickly and with fewer bugs in Cocoa, leads me to spend a huge fraction of my time in Visual Studio when I'd rather be in Xcode. It's been draining.
I released InCDius 2 back in 2002, and it never got out of beta, although beta 16 was quite stable, and I apparently have a number of loyal (yet put-upon) users. It is just an Audio CD database, not a personal media database like the well-regarded Delicious Library. It was fast, and pretty darn stable, although a few users have had database corruption issues—please back up to XML, people. I knew for years that I should release a new version, but could never quite muster the energy. It embarrassed me having this old, non-Intel application gradually getting less stable with each OS release, and I was considering my options for killing it entirely.
But my Remote Remote GH for iPhone application makes heavy use of the SQLite database, so I felt confident I could transition InCDius away from the Berkeley DB database. This transition had been the most daunting of the reasons keeping me from releasing an update.

My first impulse was to refactor the whole application using Core Data, but that would not have added anything but development time; the GUI was wired up quite well, and the application was already scriptable. All I needed was a new database backend, and Apple ships OS X 10.5 with SQLite 3 right in /usr/lib. By choosing SQLite, I was no longer responsible for compiling and upgrading my database library; I can rely on it being there and just working. I tried using the QuickLite SQLite wrapper, but it was not up to the task: it creates a large number of objects maintaining its internal cache, to the point of locking up my Mac. No, I had to call the SQLite C API directly. As long as I minimize random access into the database, I get good results.
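Calling the C API directly is less work than it sounds. A minimal sketch (illustrative only; the table and column names here are invented, not InCDius's actual schema):
#include <sqlite3.h>
#include <stdio.h>

static void print_disc_titles(const char *dbPath)
{
    sqlite3 *db = NULL;
    if (sqlite3_open(dbPath, &db) != SQLITE_OK)
    {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return;
    }
    sqlite3_stmt *statement = NULL;
    if (sqlite3_prepare_v2(db, "SELECT title FROM discs ORDER BY title", -1,
                           &statement, NULL) == SQLITE_OK)
    {
        while (sqlite3_step(statement) == SQLITE_ROW)
            printf("%s\n", (const char *)sqlite3_column_text(statement, 0));
        sqlite3_finalize(statement);
    }
    sqlite3_close(db);
}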
I also wanted this to be a release that could live on its own for a very long time; who knows how long I'll go without finding the time to update it. I wanted code which would last a very long time. This meant not only transitioning to Intel, but compiling for Intel 64-bit, and removing every deprecated API or framework message I could root out. I decided to make the minimum OS X version 10.5 (Leopard), and to transition the codebase to Objective C 2.0, although it isn't quite all there yet. I'll have to work on the garbage collection, but I am using both Objective C 2.0 fast enumeration loops and properties where appropriate.
Forward thinking means removing the past, so out went a few features tied to old APIs. I had been allowing users to import information from Classic Mac OS's CD Player preference file, but that involved using old school Resource Manager calls. Anybody ever going to use this feature again? No. Then out it goes, along with the Resource Manager.
I had 2 separate ways to play Audio CD tracks. One used QuickTime; one used Core Audio. Kept the Core Audio.
NSTableView has a new (to me) way to deal with selections using NSIndexSet, and has deprecated the messages I had been using. Goodbye, old messages.
My own Objective C objects were filled with ints, unsigned ints, longs, and even the occasional unsigned. Hello NSInteger, NSUInteger and 64 bit computing.
I had been using a custom version of NSTableView to draw blue stripes. Goodbye custom class, hello check box in Interface Builder and grey stripes.
Obviously, I had been using .nib files for building my GUI. Time to upgrade everything to .xib files.
A user sent me his database (as a .zip archive of XML files) of 15,000 individual disks. Wow, and I had been thinking InCDius was zippy fast and ready for release. Massive performance tuning and the decision to load the database at runtime instead of upon the first search. Much simpler and reliable, and lets the user do something else while everything gets ready.
Take time to remove all the warning messages. You will save yourself a huge amount of hassle in programming, but especially in Cocoa programming if you eliminate all the compile warnings. You want to know right away if the compiler doesn't know if NSView responds to "releese". Do not let warnings build up. Fix them. Fix them all.
A Google Tech Talk by Linus Torvalds inspired me to change the unique identifier key I had been using to look up disks in the database; he uses SHA1 hashes to avoid corruption in Git, and it seemed to me that a SHA1 hash of the Audio CD's Table of Contents would be as close as I could get to being unique, although there will be rare instances when 2 CDs cannot both be in the database.
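Hashing the table of contents is only a few lines with CommonCrypto. A hypothetical sketch (not the actual InCDius code; it assumes you already have the raw TOC bytes in hand):
#include <CommonCrypto/CommonDigest.h>
#include <stdio.h>

/* Produce a hex string of the SHA1 digest of a disc's table of contents. */
static void toc_digest_string(const void *tocBytes, size_t tocLength,
                              char outHex[2 * CC_SHA1_DIGEST_LENGTH + 1])
{
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(tocBytes, (CC_LONG)tocLength, digest);
    for (size_t i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
        snprintf(&outHex[2 * i], 3, "%02x", (unsigned)digest[i]);
}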
So after all this, what do I have? A fast, reliable(?), clean, focused little application, which I can send out into the world; spend the next few months fixing the occasional bug and adding the occasional feature, and which people can use for a good long while.
Labels:
Cocoa,
InCDius,
refactoring
Thursday, August 28, 2008
Signal GH - iPhone App for the HDHomerun
[Update: It's finally in the iTunes App Store.]
I submitted my second iPhone application to the App Store this morning. Signal GH is a utility to monitor the signal quality of over-the-air television for American and Canadian users of the HDHomerun. I'm charging $4.99 for it, which seems a fair price given how small a market I'm targeting: the intersection of iPhone and OTA HDHomerun users. Anybody who's set a retail price knows how hard it is. And I'm sure there will be a lot of people giving me one star and claiming it should be free, or one star and claiming they didn't understand you needed an HDHomerun. Sorry guys, gotta send my son to day care; the requirements are right in the first sentence of the blurb.
The Silicon Dust engineers did a great job with their libHDHomerun API; anybody wanting to put out a C API should check it out for its use of the language, platform neutrality, and object-like characteristics. Of course, they then released it under the LGPL, making me wait the last couple of weeks for them to modify the license. But they did, and I promptly submitted.
Now just step back and let the pennies roll in.
Labels:
antennas,
hdhomerun,
iPhone,
iPod Touch,
OTA
Saturday, August 23, 2008
Few USB Car Chargers work with iPhone 3G
[Update: an earlier version of this blog entry said that the 2A Griffin PowerJolt was able to charge my iPhone while playing audio and showing the GPS map. Additional testing showed this not to be the case.]
As I spend 4 hours a week driving my car to and from work, I need my podcasts to keep me sane. Before I bought an iPhone 3G last month, I had a very reliable cigarette-lighter-to-FireWire cable hooked up to a tiny dock which I would plug into my reliable iPod Mini. Apple, to save space, removed FireWire charging on the iPhone 3G, thus forcing users to go with the inferior USB-style charging. In a quest to replace my previous setup, I found out the hard way how inferior USB charging is.
To replace my original micro dock, I purchased an Apple Component AV Cable. This gave me a USB connector for charging and 2 RCA cables to plug into the convenient RCA ports I had previously installed in my Civic. And to think I was mocked for using RCA instead of a mini jack. Of course, this left me with 3 useless component video connectors, but they were easily bundled up.

Now I needed a cigarette-lighter-to-USB adapter, so off to Amazon. I thought the Mace Group USB Charger looked compact.

And I was soon happily zipping back and forth between work and home before discovering something: it wasn't actually charging my battery. In the meantime, I had purchased another copy for my wife's car, and the iPhone did not even recognize that one as being connected. So back they went to Amazon.
Needing a charger for a long day of taking a visitor around town, I picked up a PowerDuo/PowerJolt by Griffin at Circuit City:
That seemed to go alright for a couple days until I left the iPhone displaying a GPS map for half an hour. What's that smell? The smell of the PowerJolt overheating.
OK, might need something more heavy duty. My, this Black & Decker Power to Go cup holder charger looks the part.
Turns out it only delivers 350mA to the USB port. The iPhone won't even notice you are trying to charge it.
How about this Kensington Car Charger for iPhone and iPod, rated at 1 Amp? It says right on the box: for iPhone (no mention of 3G).

Well, it didn't overheat, but it failed my listening-to-music-with-the-GPS-running test; it couldn't quite keep up.
Finally, going back to another Griffin PowerJolt from Best Buy, this one larger and with a 2 Amp fuse.

Here we have a winner. [Correction: During a longer test, the iPhone's battery did not build charge while playing audio and displaying the GPS map, but the PowerJolt kept cool and was nearly able to keep up.] Driving around while listening to music and with the GPS on, it was actually able to top off my battery. And this model has the new don't-start-smoldering feature.
It turns out that what any given USB port will deliver in terms of power varies widely: one charger might deliver 5 times as much current as another. I would recommend finding a charger with the highest available maximum amperage, something around 2A like the more recent PowerJolts, and avoiding dual port chargers unless they can guarantee a high current to both ports.
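To put rough numbers on that (my arithmetic, assuming the nominal 5 V of USB power): the Black & Decker's 350 mA works out to roughly 5 V x 0.35 A, or about 1.75 W, while a 2 A charger can supply about 10 W, which goes a long way toward explaining why the weaker chargers can't keep up with the radio, the GPS, and audio playback all running at once.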
[Update: What is strange about this is that according to System Profiler, my iPhone lives on an allocation of 500mA while connected to my MacBook.]
Thursday, July 24, 2008
Remote Remote for MythTV for iPhone
I've written a "toy app" for OS X Touch, a version of my Remote Remote GH for MythTV application which acts as a remote control over the wireless network to control a MythTV. It's a toy in the sense I am mainly using it to learn the iPhone SDK, and not to earn money, or even a large user base. I don't expect it will find a large following, I was just learning the ways of the iPhone.
Coding for the iPhone is different. Not so much in the GUI classes or the tools, but in the outlook it requires of the engineer. I was working in a resource-starved environment; one in which I needed to take care not to run out of memory, not to use the network unless needed, and not to spend a lot of time launching or quitting. An environment where I had to justify every button on the remote control as space-worthy. I leaned heavily on Apple's performance tools, much more than I would on a Mac. And I refactored constantly. Every time I learned something new, I would change my code to work better and to better fit the SDK way.
I also needed to use SQLite extensively, and felt compelled to adapt the QuickLite framework to both Objective-C 2.0 and the iPhone. If Tito Ciuro reads this, please send me an e-mail.
And finally, I'm quite proud of most of the artwork:

It's harder than it looks coming up with simple, thematically consistent, iconic graphics. Thanks to my wife for telling me to try again after my original app icon design mashed together a picture of a widescreen TV with a remote drawn in perspective. I went with a simple rendering of navigation buttons, which I liked so much I made the buttons part of the actual GUI.
Labels:
iPhone,
mythtv,
remote controls
Sunday, July 20, 2008
My Semi-Annual Rant about Compiling MythTV
So, I am testing a little iPhone app to control MythTV, and I decide I should upgrade to the current bleeding-edge build. Which is bad enough, as I had Qt 3.3 installed on my Linux box and the bleeding edge is Qt 4 based, requiring me to figure out which PATH, QTDIR, and QTMAKE variables were needed, and then ending up with an upgraded database making it hard to go back. And then finding out that the remote control interface is pretty severely broken in a couple of places, which I either had to fix or work around by cutting back on the features of the app. And then finding out that mythweb is pretty broken, and that live TV is not improved. Etc.
Bad enough, and then I had to upgrade the Mac frontend to match the new version of the MythTV protocol. As if it never occurred to the guys at MythTV.org that standards should be flexible enough not to require constant changes. Ever notice that on top of every well-formed XML file it says <?xml version="1.0" and that people have gotten along pretty well without changing the definition every 3.5 weeks? After a couple of hours of trying to get the frontend to compile on my MacBook, which has Xcode 3.1, and an older version of Qt futzing things up, I decided that the build scripts really didn't like llvm, and installed Xcode 3.0 on the Mac Mini in the basement. Then it was a couple of hours of fixing a few errors in the Mac-specific code, and realizing the main configure file had to be made executable.
Then after it finally compiled, it was figuring out why I couldn't turn off pinging the database server (bad mysql.txt permissions), then why the frontend quit right away with the only clue being in the Console: "MythTV requires an updated schema". So the database schema had changed in the last 3 days, causing the frontend to quit without any warning. Classy.
And the playback is absolutely horrible on the Mini for some reason involving TOSLink passthrough or problems with linking in a library—I'm not sure. I really ought to buy another Mini, one which I can run as 100% OS X, and use EyeTV.
Saturday, June 14, 2008
Vectored PICT to PDF conversion in your code
You should not be creating any PICT files, but longtime Mac users might have a large number of .pict or .pct files lying around, and some (all?) of the tools Apple provides to open such files and convert them to PDF do a horrid job of it. This applies only to vectored PICTs, as bitmapped PICTs will look bad regardless.
First I refer you to this screen capture of a simple AppleWorks drawing:
You will notice that it looks a little archaic with its lack of anti-aliasing, but that's what people had back in the day.
I copied and pasted this image into TextEdit.app, saved the document as RTFD and then used the "View Contents" item in the Finder contextual menu to get to an actual old fashioned .pict file. This is how hard it is these days to generate a PICT file.
Just to see how not to do this, open the .pict file in OS X Preview.app, zoom in a bit, and you will see that however Preview is rendering PICTs it is doing a pretty poor job of it. I'm guessing it is using QuickTime import to create a bitmapped version.
Tell Preview to save as a PDF and you get this mess:
The basic problem here is that Apple provides an API for rendering a PICT into a Quartz context (and thus into PDF) which preserves the vectored nature of the original, and some applications do not use it. If you use this API, your onscreen representation of PICT files will be as good as it can be, and you will be able to export them to comparatively nice PDF files. This does not mean they will look as good as PDFs which had been created from the ground up as such; PICT's lack of support for transparency, Bezier curves, fractional coordinate systems and rotated text makes that impossible. But it will look a lot nicer.
QDPictRef LoadPICTFromURL(const CFURLRef& pictFileLocation)
{// warning, I did not actually compile or test this code
    QDPictRef result = NULL;
    CGDataProviderRef dataProvider = CGDataProviderCreateWithURL(pictFileLocation);
    if (dataProvider != NULL) // guard so a bad path doesn't lead to CFRelease(NULL)
    {
        result = QDPictCreateWithProvider(dataProvider);
        CFRelease(dataProvider);
    }
    return result;
}
Look around the header files for the QDPictToCGContext.h header and you will find:
QDPictDrawToCGContext( CGContextRef ctx, CGRect rect, QDPictRef pictRef);
Then you can use the resulting QDPictRef to draw into a Quartz context, and if that CGContextRef was created via a call to CGPDFContextCreate, then you have created as nice a copy of the original PICT as is possible as seen by this screen shot at 200% zoom:
and this PDF result:
Note that rotated text still looks horrible, as the only way to make QuickDraw draw rotated text onscreen was to draw into an offscreen bitmap and then rotate the pixels in the bitmap. It was possible, however, to insert a series of PicComments which inserted rotated text into a PICT. I've checked this out with PICTs created by a separate application, and the Apple PICT to PDF converter honors these comments. I guess AppleWorks just didn't bother to put them in. [Update: Paragraph rewritten to add extra info, and to hide temporary idiocy on my part.]
Also check out the little bump in the arrowheads, probably a glitch that went unnoticed in the non-antialiased original, caused by drawing the shaft too long and not quite on center. Otherwise, the new antialiased version looks much nicer. And you can even select and copy the (non-rotated) text right here in the browser.
It's not unusual, but it is unfortunate, that Apple is inconsistent in using its own API. Preview.app obviously does not, nor does the PICT CoverFlow plugin, but TextEdit.app appears to, resulting in the oddity of a PICT inside a CoverFlowed RTFD document looking much better than the CoverFlow of the PICT file itself.
Regardless, I recommend that anybody with a large collection of vectored PICTs make PDF copies of them as there may come a version of Mac OS X which will not have any support for PICTs at all. For instance, I doubt you can view PICTs on an iPhone. Warning: see my other posts about how PDFs do not contain the extra data which allows the originating application to recreate the original document. So keep the originals around too.
[Update: I've corresponded with Thorsten Lemke, proprietor of LemkeSoft and creator of the well known Graphics Converter application. He immediately saw the value in incorporating improved QuickDraw picture processing and conversion in his product, and version 6.11 and higher (including this morning's beta) will feature it. I envy the nimbleness of independent software vendors. I can't tell you how long my day job company takes getting a minor release out to our customers.]
[Update: here is the source for a simple automator plugin I threw together to make the conversion.
#import "QuickDrawPictureToPDFExporter.h"
QDPictRef LoadPICTFromURL(const CFURLRef pictFileLocation)
{
QDPictRef result = 0;
CGDataProviderRef dataProvider = CGDataProviderCreateWithURL(pictFileLocation);
result = QDPictCreateWithProvider(dataProvider);
CFRelease(dataProvider);
return result;
}
@implementation QuickDrawPictureToPDFExporter
CGContextRef CreatePDFContext(const CGRect mediaRect, CFMutableDataRef theData)
{
CGContextRef result = 0;
if(theData != 0)
{
CGDataConsumerRef theConsumer =CGDataConsumerCreateWithCFData(theData);
if(theConsumer != 0)
{
result = CGPDFContextCreate(theConsumer, &mediaRect, NULL);
CGDataConsumerRelease(theConsumer);
}
}
return result;
}
- (id)runWithInput:(id)input fromAction:(AMAction *)anAction error:(NSDictionary **)errorInfo
{
NSMutableArray* output = [NSMutableArray array];
if (![input isKindOfClass:[NSArray class]])
{
input = [NSArray arrayWithObject:input];
}
NSEnumerator *enumerator = [input objectEnumerator];
NSString* aPath =nil;
while (aPath = [enumerator nextObject])
{
NSURL *inputURL = [NSURL fileURLWithPath:aPath];
QDPictRef aPictRef = LoadPICTFromURL((CFURLRef) inputURL);
BOOL drawn = NO;
CFMutableDataRef theData = 0;
if(aPictRef)
{
CGRect mediaRect = QDPictGetBounds(aPictRef);
theData =CFDataCreateMutable(NULL, 0);
CGContextRef cgContext = CreatePDFContext(mediaRect, theData);
if(cgContext != 0)
{
CGContextBeginPage(cgContext, &mediaRect);
CGContextSaveGState(cgContext);
QDPictDrawToCGContext(cgContext,mediaRect,aPictRef);
CGContextEndPage(cgContext);
CGContextRestoreGState(cgContext);
CGContextFlush(cgContext);
drawn = YES;
CGContextRelease(cgContext);
}
QDPictRelease(aPictRef);
}
else
{
NSString *errorString = NSLocalizedString(@"Picture to PDF could not read the input file as a PICT.",
@"Could not read input file");
*errorInfo = [NSDictionary dictionaryWithObjectsAndKeys: [errorString autorelease],
NSAppleScriptErrorMessage, nil];
}
if(drawn && theData)
{
NSString* outputPath = [[aPath stringByDeletingPathExtension] stringByAppendingPathExtension:@"pdf"];
if (![(NSData*)theData writeToFile:outputPath atomically:YES])
{
NSString *errorString = NSLocalizedString(@"Picture to PDF could not could not create output file.",@"Couldn't write data to file");
*errorInfo = [NSDictionary dictionaryWithObjectsAndKeys: [errorString autorelease], NSAppleScriptErrorMessage, nil];
}
else
{
[output addObject:outputPath];
}
}
else
{
NSString *errorString = NSLocalizedString(@"Picture to PDF could not draw the File.",
@"Could not draw output file");
*errorInfo = [NSDictionary dictionaryWithObjectsAndKeys:
[errorString autorelease], NSAppleScriptErrorMessage, nil];
}
if(theData)
{
CFRelease(theData);
theData = 0;
}
}
return output;
}
@end
]
Thursday, May 15, 2008
Keep your Frameworks from Metastasising
Think Class Library, MacApp, PowerPlant. What do these frameworks have in common? They are all on my resume and they are all dead. Qt, Cocoa, AWT(?), Swing, MFC(?), .NET. What do these frameworks have in common? They are not dead, yet. The question mark indicates a framework in the zombie state of not being improved upon, but being used by too many people to be considered dead.
And, I am glad that frameworks die. I would rather not live in a world where, 23 years later, MacApp was the best we could do. It was a pain to work with, and I'm much happier with Cocoa or even Qt. People learned what was wrong with the old frameworks and made better ones. (If only this were true of MFC.) And, barring asteroid collision, people will come up with new frameworks in the future.
And that is the rub. Frameworks get you two ways. 1) They usually lock you into writing all your code in a particular language. If the next great application framework uses Python, and all your work is in C++, you're going to be awfully busy. 2) You drink the Kool-Aid and use framework classes everywhere. If you are a Qt user, half your methods take QStrings, and when the time comes to switch from Qt to Cocoa, you will be refactoring for weeks.
The language lock-in is tough. I've known for years that C++ is not really a great application language; it may be a decent OS-level language, but it is too fragile, too demanding, too static, and too complicated for making reliable desktop applications. But what else are you going to use for cross-platform apps? Maybe .NET will kill C++ as a cross-platform language. I don't know; it'll be a while. Hopefully, an appropriate cross-platform language will rise up to take its place. Which is part of my point: 10 years from now, we will not be using the same language on the same framework to do development. Something better will arise.
Getting back to the point of this entry, you want your code to last longer than any one framework. It's a matter of amortization. The longer your code lasts doing useful work, the higher the payoff for writing it. Over the 16 years I've been programming Macs, I've used 5 frameworks, and 3 of them are dead. I am not predicting the imminent death of Cocoa or Qt, far from it, but die they will, and I should have a backup plan. And here is the other rub: my backup plan 'til now has been to write cross-platform, pure C++ code using a lot of STL and Boost, and to try to keep the platform-specific code from creeping into the general-purpose code. But I just said, perhaps prematurely and wistfully, that C++ itself might fall out of favor with framework developers, which both makes cross-platform development more difficult without a lingua franca, and negates the hedge against your framework dying.
So I don't have a long term solution, but in the meantime do what has been good advice. Keep your use of frameworks limited to a thin GUI layer on top. Abstract the interfaces between your code and the framework. Abstract, abstract, abstract. To the extent practical, do not propagate framework classes into the meat of your codebase. If people had done this in the past, they could have skipped from MacApp to PowerPlant to Qt with a song in their heart instead of the crushing pain it was for most folks. Do not get locked in without a very good reason.
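To make that concrete, here is a minimal sketch (my illustration, not from any particular project) of what keeping the framework type at the boundary looks like; the class and method names are invented:

#include <string>

// Framework-neutral core -- knows nothing about Qt, Cocoa, or MFC.
std::string MakeWindowTitle(const std::string &documentName, bool isDirty)
{
    return std::string(isDirty ? "*" : "") + documentName;
}

// Thin Qt-only adapter; this is the only layer you rewrite if you ever leave Qt.
// (Left as a comment so the sketch builds without Qt installed.)
//
//   void DocumentWindow::updateTitle()
//   {
//       setWindowTitle(QString::fromStdString(
//           MakeWindowTitle(m_document.name(), m_document.isDirty())));
//   }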
And know when it is time to scrap your life's work and move to a new language.
"It just so happens that your friend here is only MOSTLY dead. There's a big difference between mostly dead and all dead. Mostly dead is slightly alive." --Miracle Max in The Princess Bride
...watch the things you gave your life to, broken, And stoop and build 'em up with worn-out tools... --Kipling
Tuesday, May 13, 2008
Quartz to PDF, versus PS to PDF
I am ankle-deep in Postscript code at my day job, so as a refresher I took an afternoon and hand-coded a business card for my wife, who is starting a side business arranging academic tours of China (for the Peking University Department of Philosophy and Religion). Obviously, most people would be better served creating a card in InDesign or another vector drawing editor, but this was a learning experience.
Here it is with the contact info scrubbed.
Postscript is not a friendly language, but it was simple enough to create a .ps file in my text editor—BBEdit—and just drag and drop the icon from the editor's title bar onto the OS X Preview application. Preview has the convenient feature of auto-magically converting Postscript files to PDF. Of course, compilation errors result in a cryptic failure dialog, but these were enough development tools for an afternoon's project. If I were to do this on a more regular basis, I'd compile a simple app which provided a message callback to the CGPSConverterCreate routine.
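Something along these lines would probably do. This is an uncompiled sketch: CGPSConverterCreate and CGPSConverterConvert are the real CGPSConverter entry points, but the ConvertPSToPDF wrapper, the MyNoteMessage callback name, and the error handling are my own inventions.

#include <ApplicationServices/ApplicationServices.h>

static void MyNoteMessage(void *info, CFStringRef message)
{
    // The converter reports Postscript errors and %% messages here.
    CFShow(message);
}

static bool ConvertPSToPDF(CFURLRef psURL, CFURLRef pdfURL)
{
    CGPSConverterCallbacks callbacks = { 0 };   // version 0, all other callbacks optional
    callbacks.noteMessage = MyNoteMessage;

    CGPSConverterRef converter = CGPSConverterCreate(NULL, &callbacks, NULL);
    CGDataProviderRef provider = CGDataProviderCreateWithURL(psURL);
    CGDataConsumerRef consumer = CGDataConsumerCreateWithURL(pdfURL);
    bool converted = false;
    if (converter && provider && consumer)
        converted = CGPSConverterConvert(converter, provider, consumer, NULL);
    if (consumer) CGDataConsumerRelease(consumer);
    if (provider) CGDataProviderRelease(provider);
    if (converter) CFRelease(converter);
    return converted;
}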
Consider how much easier it would be to create this file in Quartz, and how much better the output would look.
- Font handling is hard in Postscript, even if you don't have to embed descriptions of your fonts. Thus my use of standard Postscript fonts Times and Helvetica.
- Kerning is not automatic in Postscript. Yes, you can use kshow to manually set the spacing between pairs of characters. No, I'm not going to do that. Thus, the odd spacing between letters.
- If this file had included characters from non-Roman languages, complications would arise as Postscript only allows 256 characters per font encoding, potentially requiring multiple font definitions per face. None of the free Unicode support in Core Text/Quartz.
- I might have used a little transparency, but there's no such thing in Postscript.
- Postscript is hard to read and maintain. The extra syntax needed to keep the parameter stack organized distracts the eye from the actual algorithm being described. Postscript is certainly more compact in the editor, by an order of magnitude, than a series of Quartz API calls, but much of that compactness is wasted pushing, popping, dup'ing, and rolling the parameter stack. I mean, just look at it: [Update: changed ATSUI to Core Text]
gsave
    basefont [9 0 0 9 0 0] makefont setfont  % scale the base font to 9 points and make it current
    (Nashua, NH 03061)
    dup                  % keep a copy of the string for show
    stringwidth pop      % width of the string (discard the height)
    2 div neg 0 rmoveto  % back up by half the width to center the text
    show
grestore
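For comparison, here is a rough sketch of the same centered line drawn with Quartz and Core Text. It is mine and uncompiled: the DrawCenteredLine name is invented, and the context and the centerX/baselineY coordinates are assumed to come from the caller. Core Text measures, kerns, and encodes the glyphs for you.

#include <ApplicationServices/ApplicationServices.h>

static void DrawCenteredLine(CGContextRef ctx, CGFloat centerX, CGFloat baselineY)
{
    // Build an attributed string in 9 point Times to match the Postscript above.
    CTFontRef font = CTFontCreateWithName(CFSTR("Times-Roman"), 9.0, NULL);
    CFStringRef keys[] = { kCTFontAttributeName };
    CFTypeRef values[] = { font };
    CFDictionaryRef attrs = CFDictionaryCreate(NULL,
        (const void **)keys, (const void **)values, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFAttributedStringRef text = CFAttributedStringCreate(NULL,
        CFSTR("Nashua, NH 03061"), attrs);
    CTLineRef line = CTLineCreateWithAttributedString(text);

    // Measure, center, and draw; kerning comes along for free.
    double width = CTLineGetTypographicBounds(line, NULL, NULL, NULL);
    CGContextSetTextPosition(ctx, centerX - width / 2.0, baselineY);
    CTLineDraw(line, ctx);

    CFRelease(line);
    CFRelease(text);
    CFRelease(attrs);
    CFRelease(font);
}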
The fact of the matter is that Apple has done a lot of heavy lifting for us either via proprietary extensions to PDF, or taking the extra effort of providing optimized support for font embedding. An afternoon spent trying to keep track of an unruly parameter stack was enough for me to appreciate how much power Quartz gives us, and how easy it is to call upon.
Labels:
Postscript,
Quartz
Thursday, April 03, 2008
What does it mean when your app launches faster in Debug?
Ummm, maybe you forgot to uncheck "Open using Rosetta" in the release build?
I was enthusiastically awaiting how fast my day job application would launch now that I've eliminated the slowest thing. The debug build launched fast, the release build launched fast under instrumentation, but the release build launched really slowly from the Finder. What?? I was about to go up to bed when it occurred to me that I had switched on Rosetta for something months ago and might have just forgotten to turn it off. Yep. The application launches quickly now. Users might even notice when this gets to them in a couple of months.
By the way, when you are using Apple's Core Foundation XML Parser, you really really want to use CFTreeGetFirstChild followed by a series of CFTreeGetNextSibling calls to traverse the elements. You really don't want to call CFTreeGetChildAtIndex on anything approaching a large XML document.
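For the curious, here is a quick sketch of the fast traversal. It is my own example, not compiled, and it assumes the tree came out of CFXMLTreeCreateFromData:

#include <CoreFoundation/CoreFoundation.h>

static void VisitElements(CFTreeRef parent)
{
    // First-child / next-sibling walks the children in O(n); asking for child i
    // with CFTreeGetChildAtIndex re-walks the sibling list on every call, which
    // becomes O(n^2) over a large XML document.
    for (CFTreeRef child = CFTreeGetFirstChild(parent);
         child != NULL;
         child = CFTreeGetNextSibling(child))
    {
        CFXMLNodeRef node = CFXMLTreeGetNode((CFXMLTreeRef)child);
        if (CFXMLNodeGetTypeCode(node) == kCFXMLNodeTypeElement)
        {
            // ... examine the element here ...
        }
        VisitElements(child);   // recurse into grandchildren
    }
}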
Monday, March 17, 2008
iPhone App Ideas: A Target Rich Environment
I was watching a recent episode of Anthony Bourdain's No Reservations on the Travel Channel this morning, and came up with an idea for an application.
Bourdain described the process by which orders are conveyed through a restaurant: the customer tells the waiter, the waiter writes on an order slip, the waiter goes to the waiter station and inputs the order into a terminal, and the order gets directed to the appropriate kitchen personnel. I was struck by how the input terminal involved bringing up a diagram of the appropriate table and inputting the order for each chair around the circle. This seemed like something easily transferable to an iPod Touch style interface.
I'm imagining a situation where the customer tells the waiter, the waiter touches the table in a map of the restaurant, touches the seat, is given a series of choices corresponding to items on the menu, touches complete, and the order is sent directly to the kitchen. If the customer orders shrimp and the kitchen has just run out of shrimp, the software would show an alert.
In this scenario, the waiter only has to enter the data once, there is less chance of transcription errors, the restaurant does not have to set aside valuable floor space for the waiter station, and status information is more current. Presumably, the people best positioned to make such software are whoever makes the current restaurant terminals, but some enterprising types could get into the business right now, while the going is good, by providing an entire Mac/iPod based solution which incorporates the whole system: data entry, accounting, kitchen display screens, etc. If anyone uses this idea, cash is always an appropriate gift.
It would be nice if Apple were to come up with an industrial version of the iPod Touch for specialized situations like this, one not worth stealing as a media player.
Wednesday, March 12, 2008
The iPhone and the End of Bloatware
You still see tables filling full-page ads in computer magazines: a column of checks for the product being advertised, a column of blanks under the competitors. This was the game of the big company, the Microsofts of the world; organizations with large enough staffs to code solutions to any need or perceived need of any customer, big or small. Every customer needs 10% of Microsoft Office, but every customer needs a different 10%. Smaller, more nimble companies have made better pure word processors, but they have all been relegated to niche markets, as the vast majority of customers know that somewhere in Office are the exact features they need, plus a bunch of features they might someday need, for free. It's just a matter of finding them. And in a world of full keyboards, contextual menus, 24" monitors, gigabytes of RAM, terabytes of storage, multi-cores, and fast CPUs, there's always somewhere to tuck a feature.
Now comes the iPhone/iPod Touch with its 480x320 screen, no keyboard shortcuts, finite memory and demand for near instantaneous launch and quit times. Ponder this small subset of toolbars for Word 2003 on Windows:

and imagine that each icon was twice as big to accommodate the stubby adult forefinger. Imagine making even this small subset of functionality accessible somehow while still showing content. I've a good imagination, but it fails me here.
Anti-bloat features of the iPhone OS:
- Apps must launch quickly (in practice, anything more than a second will seem intolerable), and quit as fast or faster
- Apps must be relaunched every time you move back from another app.
- There is a practical limit of 10 or so toolbar-type icons per view, and 5 is more reasonable. This leads to a geometric decrease in accessible functionality versus a program on a PC. If you can fit 40 toolbar icons on your computer's monitor, and each opens a dialog with 40 items, and each dialog item invokes a subdialog with 40 features, you have 40 × 40 × 40 − 40 − 40 = 63,920 accessible features versus 5 × 5 × 5 − 5 − 5 = 115 features 3 levels deep on an iPhone (well, you could probably fit more than 5 items per view, but you get the idea)
- There are no contextual menus, much less multi-level contextual menu trees
- There are no keyboard shortcuts, eliminating the need for laminated keyboard shortcut hint sheets Scotch taped to the back of the iPhone
- There are gestures, but really, how many unique gestures can you expect a person to master: pinch, shake...?
- There is a finite amount of RAM, and telling the user to "just add more" is not an option
- The lack of both RAM and application interaction is a firewall against the metastasized bloatware that is the "Office Suite," whose parts have to at least pretend to work together
No, what we will get are mini-apps. Apps a single competent programmer can bang out and polish in 3 months or less. The functionality which makes up Office could be broken up into scores of such mini-apps: a mini-table creator, a mini-slide presenter, a separate mini-slide editor, a database client, a little drawing app. In fact, even these small apps will probably be too big, and will be streamlined for more specific, task-oriented, specialized purposes: a blog post word processor, a series of charting modules which each do one type of chart, a specialized Keynote which only has the tools needed to create presentations following a single corporation's style—imagine all those exactly-the-same-style presentations you see at Apple events. That would be doable in an app which could only show 5 toolbar icons at once, and which relied upon gestures, animation, templates, and transparency to maximize what the user can see and do. And, lots of apps to help with one's location-based social networking, whatever that is.
In the iPhone ecosystem, bloatware is non-adaptive. The big have no feature advantage over the small.
The experienced Cocoa programmer does have an advantage, however. I have been reading through the Apple example code, and it is made up of delightful exercises in object-oriented minimalism. Actual, full-blown apps written in this style will be incredibly lightweight for the functionality delivered, with zippy launch times and responsive interfaces. This seems to be how one masters a framework, and it is especially true of Cocoa: by learning how to do the most with the least code, which is, again, the antithesis of the bloatware mentality, which seems to only care about churning out as much code as possible.