Wednesday, December 29, 2010

On Keeping Politics Out of Non-Political Blogs

This is a blog about technology and programming. That is its mission, and why readers find it*. They do not come for my political opinions; as far as my readers are concerned, I don't have them. It would be a betrayal of the trust between me and whoever visits my blog to insert snarky little bits of politics. I certainly feel betrayed and imposed upon when an otherwise fine blog or podcast feels the need to sneak in something political; do it too often and that podcast is out of my iPhone, e.g., Grammar Girl. Same with blogs, e.g., Roughly Drafted.

This comes up because of the last several minutes of the Engadget Podcast of Christmas 2010. Now the Engadget Podcast is my favorite podcast; I will actually re-listen to some episodes. It can be very funny, although the Engadget Bingo thing is lame, and it's about gadgets, so what's not to love? Well, there's plenty not to love when Joshua Topolsky and Nilay Patel decide that doing their actual job of putting on a gadget podcast and making amusing pop culture references is too unimportant, and start verbally abusing Paul Miller, who disagrees with them but, more to the point, would rather talk about gadgets.

If Joshua or Nilay would like to build up an audience for a political podcast, that would be swell**. But taking their pre-existing audience, whose only shared interest is the love of devices, and subjecting them to their views is a breach of the compact that comes from their podcast being listed in the Gadgets section of iTunes.

Is this post an imposition on my readers? Well, a bit. In its defense, it is a meta-post about political content, not that content itself. And I have few frequent readers to offend.


* I'm not claiming I have regular readers, or many readers. I have people Googling specific technical problems and never coming back. If I'm lucky, this post will be read by a couple friends of mine, as it serves no practical purpose other than me venting.

** I wouldn't actually listen to such a podcast, because for whatever reason I have no interest in podcasts or TV shows about politics, although I read a large number of political blogs.

Tuesday, December 21, 2010

On The Need for MonoMac

I continue to listen to the Dot Net Rocks podcast even though I've yet to write a line of C#. It is a well done and even handed podcast, and one of these days I shall write a post about how the hosts demonstrate weekly how one can advocate a technology while maintaining credibility. But this post is about MonoMac, which I haven't used but someone at my company is, and I go to the meetings. If this means I'm writing without enough facts, then this might be a blog.

[Update: I've been informed that what I've written below is not possible in MonoMac. Apparently, the MonoMac developer community cannot imagine the concept that C# isn't the solution for every programming situation. I am not a publicly profane person, but I'm tempted here. But please keep on reading about an alternate universe where programmers don't fall in love with a language at the expense of reason.]


Tireless advocate for all things Mono, Miguel de Icaza was on the show this week and he got me thinking about why anyone would use MonoMac. And I do think many people will use MonoMac, but (I hope) not for the reasons Mr. de Icaza gave on the podcast.

To summarize de Icaza's pitch for using MonoMac: if you love C# and managed code, then you can use your favorite language to write Mac applications. You'll have to do at least some work outside your beloved Visual Studio, and you'll have to use completely unfamiliar APIs, but at least it'll be in C#. And you'll get to use the whole Cocoa API, plus various other OS X technologies like Core Audio, Core Animation, etc., but they'll be wrapped in C# shims. Oh, and you'll get a strongly typed language instead of the horrors of dealing with the chaos of loose typing.

As someone who's spent his professional life writing cross-platform code, I find this a really bad and limiting sales pitch. It is pitched at people who just don't want to learn another language for whatever reason, but would be willing to learn the entire OS X API, albeit trapped in a language whose calling syntax for that API becomes cumbersome.

First of all, you have to realize that the Objective-C language is small. It is not C++, which even a professional coder might never learn in its entirety. It is an elegant little language which adds object-oriented extensions to C. The fact that you spend most of your time using those extensions doesn't make the language any bigger. It has had some further extensions in recent years, but it is still reasonably compact.

Also realize that the OS X APIs are huge, composed of literally thousands of messages, methods, structures, enumerations and constants in dozens of frameworks. I've been programming on OS X for a decade, and Apple adds APIs at a rate faster than I can learn them.

So, MonoMac saves you the trouble of learning a small language, while expecting you to master a large API which has been transmogrified to make sense to C# programmers. And the vast majority of the documentation and forum postings for that API assume it is in another language. That's not much of a pitch.

I used to work on a karyotyping application when I was living in suburban Chicago. It was Mac/PC and was written in C++. 90% of our code was shared between the platforms. We used standard application frameworks, at the time PowerPlant and MFC, which relieved us of maintaining code every application uses, and gave us a lightweight entry into native appearances and behaviors. Almost all the document data structure and rendering code was cross-platform. So 10% of the code dealt with the user-facing part of the application or called into native operating system services, and the remaining 90% was vanilla C++ dealing with an abstract environment.

Consider a typical Model-View-Controller design. If one has a shared language like C# running in Mono, then we can factor our code such that all or almost all of the Model code is in shared vanilla C#. Some or even most of the Controller code can be written in C#, although I'd prefer that Objective-C be used for Controllers on the Mac. And the bulk of the View code is no code at all, but what you get from the framework by using standard widgets, windows and other classes via the provided GUI tools. Basically, you want the framework to handle everything outside your document frame in your window. Depending on the application, you can have very high code reuse, or less if you need high performance access to operating system services, such as the example de Icaza gave of using the Audio Units API for sound processing. Regardless, such services should be abstracted away and created with factories to make them callable from vanilla code.
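The "abstracted away and created with factories" idea is easiest to see in code. Here's a minimal, language-agnostic sketch in plain C (the names are hypothetical; a real MonoMac project would express the same shape as C# interfaces with platform-specific implementations behind them):

```c
#include <stdio.h>

/* Abstract interface the vanilla (shared) code programs against. */
typedef struct SoundService {
    const char *(*backendName)(void);
    void (*play)(const char *clipName);
} SoundService;

/* One platform-specific implementation. A Mac build would wrap
   Core Audio behind these functions; a Windows build would wrap
   its own audio API. */
static const char *StubBackendName(void) { return "stub"; }
static void StubPlay(const char *clipName) {
    printf("pretending to play %s\n", clipName);
}

/* The factory: each platform compiles in its own version, so
   vanilla code never names a platform type. */
SoundService *CreateSoundService(void) {
    static SoundService service = { StubBackendName, StubPlay };
    return &service;
}

/* Vanilla code: manipulates only the abstract interface. */
void PlayAlertSound(SoundService *sound) {
    sound->play("alert.wav");
}
```

Vanilla code just calls PlayAlertSound(CreateSoundService()) and never knows which platform is underneath.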

So you'd end up with 4 kinds of code:
  • Vanilla C# that doesn't do much beyond manipulate vanilla objects and simple types, and implement various abstract interfaces or protocols
  • Windows-specific C# that makes use of Windows frameworks
  • MonoMac-specific C# that makes use of native Mac services
  • Objective-C for Mac-specific GUI code, where C# would be awkward

For a lot of applications, the first kind can be the bulk of the code.


This takes discipline but it results in a more maintainable, focused code base.

I think that's a better pitch.

Not that I'm advocating it; I haven't tried to see if my decade-old experiences with cross-platform design are transferable to the Mono runtime environment. Nor do I think Mono has proven itself enough to risk a project on it. I was floored that such a well known project only has 200 or so active users.

And I like Objective-C, it's a lot of fun.

Wednesday, November 03, 2010

Logitech Still Sells the Z-5500

I just yesterday replaced the decade-old Logitech Z-5500 Digital speaker system in my TV room. It had spent several years on my desk when I had an apartment, and then spent the last 5 years giving surround sound to my house. It gave me reliability and a nice set of inputs. As a gadget guy, I'm just shocked that something so archaic is still being actively sold. Go look at its specification page and it touts its “Analog stereo-mini (on side panel of control center) for portable CD, MP3, or MiniDisc® players”. Now that's a blast from the past, the MiniDisc and the portable CD player.

I replaced the Z-5500 with a Denon AVR 591 which is a real receiver with modern HDMI ports, modern audio codecs and an auto calibration system. Still, I am going to miss the conveniently small control module of the Z-5500.

Thursday, September 30, 2010

New version of Signal GH

Just got a note from Apple saying that a new version of my Signal GH iOS app to monitor over the air TV signal quality with HDHomeruns was approved. Which is good. I make little money on the product, but I like tweaking it.

It has two major features: an iPad layout and a map of broadcast towers in the United States (sorry, Canada).
Both of these features were a lot of fun. The size of an iPad screen is luxurious and freeing. In this case, I just scaled everything up, as I think the tabbed interface works for this particular application. It might have been more iPad-ish if I'd put the settings in a popover control, but I felt people would be spending a good amount of time in setup, and would get nervous spending too much time in a spring-loaded widget. Other iPad apps I've written have gone whole hog using popovers for accessing functionality.

The map was a joy to create. MapKit is a great example of a framework that Apple just gets right. Getting my TV towers from a Core Data database to the screen could not have been easier due to the flexible use of protocols and categories. However, the standard drop pin wouldn't work because TV stations tend to share a single tower. So I went with a custom flower petal annotation giving me room for 7 major networks and a generic logo.

Compass support was nearly as easy, although I did have to use my own custom annotation for that, when it would be nice just to set a flag and have it added. On the other hand, I felt the standard radar wave animation for the user's current location was too distracting and brought attention to the wrong detail, so I replaced it with a more static graphic merged with the compass.

Thanks to my sister, Sarah Howes, for doing the research needed for populating my TV station database.

Wednesday, September 15, 2010

Finally! Apple Embraces A Standard for Metadata in PDFs

I draw your attention to the header file CGPDFContext.h in the iOS 4 SDK:

void CGPDFContextAddDocumentMetadata
(CGContextRef context, CFDataRef metadata) CG_AVAILABLE_STARTING(__MAC_10_7, __IPHONE_4_0);


The iOS 4 SDK has been out for several months but I hadn't noticed this change until today. Not that I have any use for it today on iOS, it's on the Mac where it is crucially needed.

Starting with Mac OS X 10.7, developers will be able to embed arbitrary XML data in the PDFs they can generate with Apple's APIs. [Update: I guess there is no actual requirement of XML based on just the API. I'd recommend standardizing on XML though.] They have decided to use the method advocated by Adobe in which a metadata stream object with a compressed XML payload is inserted into a PDF (Apple does the compression for you, you just have to provide a block of XML data). This is a good way of doing it, but more importantly, it is a simple and standard way of doing it. I would have preferred having an additional vendor tag where I could mark the metadata with a "com.genhelp.mydrawingapp" identifier, but that is not crucial; I can get that data from the XML.
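For what it's worth, the metadata stream Adobe describes conventionally carries an XMP packet: your XML wrapped in xpacket processing instructions. Here's a sketch of building such a payload in C, assuming you want to follow the XMP convention (the packet id string is the fixed magic value from Adobe's XMP specification; the XML body you pass in is entirely up to you):

```c
#include <stdlib.h>
#include <string.h>

/* Wrap caller-supplied XML in the standard XMP packet header and
   trailer. Caller owns the returned buffer and must free() it. */
char *WrapInXMPPacket(const char *xmlBody) {
    /* The id is the constant magic string from the XMP spec; the
       begin attribute is a UTF-8 byte-order mark. */
    const char *header =
        "<?xpacket begin=\"\xEF\xBB\xBF\" id=\"W5M0MpCehiHzreSzNTczkc9d\"?>\n";
    const char *trailer = "\n<?xpacket end=\"w\"?>";
    size_t len = strlen(header) + strlen(xmlBody) + strlen(trailer) + 1;
    char *packet = malloc(len);
    if (packet) {
        strcpy(packet, header);
        strcat(packet, xmlBody);
        strcat(packet, trailer);
    }
    return packet;
}
```

You'd hand the resulting bytes to CGPDFContextAddDocumentMetadata as a CFDataRef; Apple compresses the stream for you.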

Why is this Important?

As I've explained several times before, on the Mac we used to have this feature which I will call Round Trip Editing, wherein a user could make a drawing in one application, copy and paste the drawing into another application, and then later copy and paste back into the original application, and still be able to edit the drawing. Generations of Mac users relied on this feature to go from applications like ChemDraw into PowerPoint and back again.

This feature has been lost as applications transition from using the archaic PICT clipboard flavor to the modern and beautiful PDF clipboard flavor. There was no direct way to embed large data in Apple generated PDFs, and thus developers were left on their own to munge the format if they dared. And, with no standard or expectation of data embedding, applications did not bother to preserve the original PDF resulting in data loss.

Getting round trip editing working again requires 3 steps: Apple has had to provide an API for data embedding; content-generating applications have to be modified to use that API; and office applications have to be modified to return original PDFs to the clipboard when a single image is selected. With Mac OS X 10.7, step one will be here, about 4 OS versions tardy.


BTW, here is how to call it:

CFDataRef MakeAPDF(CFDataRef someXML)
{
    CGRect mediaRect = CGRectMake(0, 0, 400, 600); // use your own rect instead

    CFMutableDataRef result = CFDataCreateMutable(kCFAllocatorDefault, 0);
    CGDataConsumerRef PDFDataConsumer = CGDataConsumerCreateWithCFData(result);

    // mark the PDF as coming from your program
    CFMutableDictionaryRef auxInfo = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionaryAddValue(auxInfo, kCGPDFContextCreator, CFSTR("Your Program's Name"));
    CFDictionaryRef auxiliaryInformation = CFDictionaryCreateCopy(kCFAllocatorDefault, auxInfo);
    CFRelease(auxInfo);

    // create a context to draw into
    CGContextRef graphicContext = CGPDFContextCreate(PDFDataConsumer, &mediaRect, auxiliaryInformation);
    CFRelease(auxiliaryInformation);
    CGDataConsumerRelease(PDFDataConsumer);

    // actually make the call to embed your XML
    CGPDFContextAddDocumentMetadata(graphicContext, someXML);

    CGContextBeginPage(graphicContext, &mediaRect);
    // do your drawing, like this grey rectangle
    CGContextSetGrayFillColor(graphicContext, 0.5, 0.5);
    CGContextAddRect(graphicContext, mediaRect);
    CGContextFillPath(graphicContext);
    // end your drawing
    CGContextEndPage(graphicContext);

    CGPDFContextClose(graphicContext);
    CGContextRelease(graphicContext);
    return result;
}

And here's how to get the data back. Check that the XML is yours and not some other program's.

CFDataRef ExtractMetaDataFromPDFData(CFDataRef pdf)
{
    CFDataRef result = NULL;

    CFRetain(pdf);
    const UInt8 *pdfData = CFDataGetBytePtr(pdf);
    CFIndex pdfDataLength = CFDataGetLength(pdf);
    CGDataProviderRef dataProvider = CGDataProviderCreateWithData(kCFAllocatorDefault, pdfData, pdfDataLength, NULL);
    CGPDFDocumentRef pdfDocument = CGPDFDocumentCreateWithProvider(dataProvider);
    CGDataProviderRelease(dataProvider);

    if(pdfDocument)
    {
        CGPDFDictionaryRef docDict = CGPDFDocumentGetCatalog(pdfDocument);
        CGPDFStreamRef metastream = NULL;
        if(CGPDFDictionaryGetStream(docDict, "Metadata", &metastream))
        {
            CGPDFDataFormat format = CGPDFDataFormatRaw;
            CFDataRef streamData = CGPDFStreamCopyData(metastream, &format);
            if(streamData)
            {
                if(format == CGPDFDataFormatRaw)
                {
                    result = streamData; // we own this reference via the Copy rule
                }
                else
                {
                    CFRelease(streamData);
                }
            }
        }
        CGPDFDocumentRelease(pdfDocument);
    }
    CFRelease(pdf);

    return result; // check to see if this is your XML
    // remember to release result when done
}

Monday, August 23, 2010

The Three Numbers in Computer Programming

When I was working at a small company in Illinois, Vysis, my team had an architect named Ian Poole who taught me many things, one of which was the simple fact that there are only three numbers in software design: zero, one and every. By this he meant that you can support the case where there are zero copies of an entity, one copy of an entity, or an arbitrary number.

I ran into this problem when revamping my iPhone app, Signal GH, which monitors antenna signal strength from an HDHomerun. I had mistakenly written it to handle two tuners in a single device. The code was littered with references to the "yellow" tuner and the "cyan" tuner, as those were the colors of their graphs. Well, Silicon Dust came out with a version with a single tuner, which sort of shot me in the foot. And then I bought that device, which gave me three tuners on my network. I had a painful time backing out my bad design choice in favor of supporting an arbitrary number (I did limit it to keeping track of 8 tuners, but that number is kept in only one place and could be changed in a few minutes).
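In code terms, the fix was going from two named tuners to an array whose capacity lives in exactly one place. A simplified sketch in C (hypothetical names, not the actual Signal GH source):

```c
#include <string.h>

/* Before: two tuners baked into the design, e.g.
   struct { Tuner yellowTuner; Tuner cyanTuner; } ... */

/* After: "every" -- an arbitrary count, with the cap in ONE place. */
#define MAX_TUNERS 8

typedef struct Tuner {
    char deviceID[16];
    int  signalStrength; /* 0-100 */
} Tuner;

typedef struct TunerList {
    Tuner tuners[MAX_TUNERS];
    int   count;
} TunerList;

/* Returns the index of the added tuner, or -1 if the list is full. */
int AddTuner(TunerList *list, const char *deviceID) {
    if (list->count >= MAX_TUNERS)
        return -1;
    Tuner *t = &list->tuners[list->count];
    strncpy(t->deviceID, deviceID, sizeof t->deviceID - 1);
    t->deviceID[sizeof t->deviceID - 1] = '\0';
    t->signalStrength = 0;
    return list->count++;
}
```

One tuner, three tuners, or eight all go through the same code path, and raising the cap is a one-line change.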

The new version will be out shortly, fighting off one last bug.

Thursday, July 08, 2010

Wednesday, July 07, 2010

iPhone 4 improves on audio output isolation

I often hook up my iPhone to my car stereo and listen to podcasts while driving. I tend to drive with one earphone in place so I can click to pause or answer phone calls. In the past, whenever I turned off my car's stereo, I could still hear an indistinct and annoying noise coming through my headphones even though the sound was supposed to be going out the line level output in the phone's dock connector. Now, with the iPhone 4, it's quiet. Just another little improvement.

Tuesday, June 01, 2010

On Fixing the Personal Computer's Original Sin

A conversation with an old friend the other day brought up the limited file handling ability of the iPad. Apple really doesn't want you to deal with files as such on the iPad. When the original iPod came out, there was a lot of criticism about not allowing the user to maintain their own MP3 file trees and drag and drop onto the device as competing products allowed; and Apple kept it that way. These are two examples of rectifying one of the original sins of the computer: forcing users to deal in an ad hoc manner with individual documents as files.

Oh, you might say that it is freedom itself to be responsible for filing away today's expense report as a spreadsheet inside of a folder with 500 not quite identical others, or knowing where each of 10 versions of Margaritaville is located, and which is your favorite. To the contrary, it is just another example of the user being trapped by a lack of imagination on the software architect's part into doing something the computer should be much better at doing.

Or you might say that the individual file is a natural unit of information, like the Boolean bit; but that is not true. The computer file was invented by a man (unless it was invented by Grace Hopper, which I doubt), and computers could have evolved with some other mechanism, perhaps mimicking the human brain, which I don't believe uses files.

If I were a bench chemist, and every day I drew some new variant of a steroid, where would I want my drawings, in individual proprietary documents, or in a database giving my work structure and context? Pretty obviously in a database. It creeps up on us, the slow flood of documents; the more organized among us can keep it together through ever deepening trees of folders and files, but eventually individual files in folders becomes unmanageable. I've been using a computer for 25 years, I look forward to at least another 40 years of use. There is no way I will be able to productively keep track of all my creations over that time in a tree structure. As it is, anything older than 5 years might as well be non-existent.

So the operating system vendors introduce search, and we get by like our computers are mini-Googles; as long as we can remember some key phrase we can find it; except when words fail us, or the document can't be parsed by the indexer.

And the cloud beckons. Anyone who thinks the cloud is just a well sorted FTP site, doesn't understand. The iPad is right now at one of those points where Apple can see but cannot provide, or even enunciate the ultimate solution, but does not want users to get into the habit of using individual files for their document needs; so we get the hacky solutions that will be cast aside the moment anything remotely elegant is provided. And I don't know what this solution is either. People have been trying to improve upon the file based system for years; the Newton didn't have files, BeOS had some sort of database file system. OpenDoc tried to get all documents to live together in harmony, etc. And none of these were a market success. Files will not die easily; and you will still have to export to a file for a good long time.

Barring a solution, the point of this entry was just to ask people to keep an open mind about files, or the lack of them. Files are not the thing you want; you want to create, edit and view media and documents, and how they are stored is secondary to how quickly you get to them and how safely they are stored.

Monday, May 24, 2010

Flash's History of Neglect on the Mac - An Example

I turn your attention to this apparently genuine June 18, 2008 forum comment by Tinic Uro, which I will quote:
We have identified the bottleneck in the Flash Player for OSX. Like in the other plugins the culprit is text rendering, in this case rendering using device text. This benchmark spends >50% in a single OSX function: ATSUGetUnjustifiedBounds. You can verify this yourself using Shark. I am working on a change which will cache the results returned by that API to where this call should completely disappear from the performance profile.

If everything turns out well I hope to see that change in future release. Some new numbers on my Mac Pro (note that these represent numbers which can change at any time if we decide to do things differently):

Before the change: ~8.5fps
After the change: ~28fps in Safari (~26fps in Firefox)

This speaks volumes about how little platform specific TLC Adobe historically put into the Mac plugin. A performance bottleneck as glaringly obvious as this should never have been seen by a user, much less bedeviled users and Flash developers for years before someone bothered to run Shark, or nowadays the CPU Sampler instrument. Adobe must never have tried to optimize on the Mac at all; if I found a common use case where I was spending half my time measuring the same block of text over and over again, I'd joyously jump on the opportunity to make my users' lives easier, because I could fix that in an hour. Apparently, Adobe couldn't be bothered to do the right thing until years later, and countless hours were lost by Flash developers trying to coax the plugin into looking just as smooth on the Mac as it did on Windows.

And now that Apple is the big dog in the mobile application space, it's payback time for all the years of neglect and shoddiness. And with the eyes of the world upon it, Adobe is having to do the actual heavy lifting of getting decent performance and battery life out of a handful of Android phones just to prove Apple wrong. At least something good might come of this.

In the meantime, I've installed ClickToFlash on my Mac and gotten iPad-like silky smooth performance out of Safari. It is amazing what all those little ads cluttering up the dozens of open pages I tend to accumulate were doing to my poor MacBook.

Friday, May 07, 2010

Will the Next 13" MacBook Pro have an Optical Drive?

I've been following the news about the new MacBook 13" still having a Core 2 Duo when its bigger brothers have i5 and i7s. Apparently this was because there was not room to fit the added NVidia chipset needed to allow GPU switching. If this chipset problem persists, I'm going to bet that Apple removes the optical drive from the 13" MacBook Pro, giving room for the chipset and a bigger battery, and further I'll speculate that they begin selling a little NAS computer that includes: a shareable optical drive, a centralized iTunes server, a Time Machine server, and AppleTV functionality. This mashup of 4 separate products: Time Capsule, AppleTV, MacBook Air optical drive, and the iTunes library functionality makes a lot of sense, and would be billed as an energy efficient and environmentally friendly way to reduce redundancy. And somehow, this will tie into their cloud strategy which will be coming online shortly. And it would probably have enough GPU powered oomph for on the fly re-compression of video for use on iPhones and iPads.

Wednesday, March 24, 2010

Bye Bye Pentium 4 Server

No sooner had I assembled a mini home theater PC for the TV room than Amazon Vine offered me a Zotac MAG HD-ND01-U, which is basically the same thing in more appliance-oriented packaging.

The guts of my original HTPC will now be moving into the house's new Linux/MythTV server: add a large hard drive, a cheap DVD drive and a larger case. This is going to save me a lot of electricity. A Pentium 4 is a pretty poor choice for a lightweight server, and is now draining over $10 of power a month; the new Atom-based server should draw less than a third as much energy, and perhaps even less if, as I hope, the more modern design has better idle characteristics. I'll save quite a bit of money in the long run by mothballing my old Dell. And the new Atom/Ion motherboard should be fine for serving the occasional file and recording TV shows.

2.0 GHz Pentium 4 + hard drive + external hard drive = 110 W (or so) at idle
Atom 330 + laptop drive + sleeping hard drive = 25 W idle

If I pay a dollar for every 10W per month, then I'm saving around $8 a month or $96 a year, so the added $180 I'm spending will be paid for in 2 years (or in 4 years if I had had to buy a motherboard and RAM). And I'll have a quiet server that isn't constantly annoying me with its fan.
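The arithmetic, sketched out (the dollar-per-10W rate is my rough approximation, not a measured utility rate):

```c
/* Rough payback math for replacing the Pentium 4 server. */
double MonthlySavings(double oldWatts, double newWatts, double dollarsPerTenWatts) {
    return (oldWatts - newWatts) / 10.0 * dollarsPerTenWatts;
}

double PaybackMonths(double upgradeCost, double monthlySavings) {
    return upgradeCost / monthlySavings;
}

/* MonthlySavings(110.0, 25.0, 1.0) -> 8.5 dollars a month (I round
   down to $8 in the text); PaybackMonths(180.0, 8.5) -> about 21
   months, i.e., roughly 2 years. */
```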

[Update: It's so pleasantly quiet in my laundry room today. Just have to migrate the svn server on the little NAS to the new server and it's going to be so peaceful in the basement.]

Tuesday, March 09, 2010

Holding onto Legacy PCI Cards is Limiting

After the happiness of putting together an NVidia Ion HTPC for the TV room, I've been looking at replacing the Linux server in the laundry room. It's a typical noisy, energy-hogging Pentium 4 Dell with non-gigabit Ethernet and multiple hard drives either inside it or in an external case. I'd like to replace it with a low power, quiet, modern model. I could collapse my MythTV backend, SVN server, and file server into one box and save on the order of $15/month in electricity. So, I'd like to move to something like:
  • Dual core Intel Atom 330 processor
  • Ion chipset
  • Small SSD boot drive
  • Large (1.5 or 2.0TB) Green Drive
  • 2 GB RAM
  • mini-ITX form factor

The fly in this ointment is that I have an ATSC tuner on a PCI card in the current box (a very reliable PCHDTV 3000), and getting a mini-ITX Ion board with a PCI slot is limiting. As far as I can tell, the choice is limited to one motherboard: the ASUS AT3N7A-I. And anytime I see customer reviews that include such tidbits as "noisy fan" or "uses more energy than other Ion motherboards," I start wishing for other options.

And the other option is to go with another network tuner. I already have one HDHomerun, which along with my PCI tuner gives me the capacity to record three simultaneous shows on the rare days when that's a good thing. Replacing the PCI version with another HDHomerun would open up a larger world of Ion or Atom motherboards with more modern card slots. I even see that Silicon Dust has recently released an economy single tuner model which can be had from Newegg for $81.97 shipped. And this will keep the new server cooler and quieter, as well as providing another tuner that Windows Media Center on my wife's computer can access. Win, win, win.

Tuesday, February 23, 2010

Could Apple Transition to Using iPhone Apps as Safari Plugins?

I've been knee deep in trying to keep an old style NPAPI (Netscape) plugin working under Snow Leopard, and believe me, it is not easy. And one of the reasons it is not easy is that Apple has used the 64-bit transition to lock down what an NPAPI plugin can do and when it can do it. No longer are we allowed to do things like bring up our own windows, draw whenever we feel like it, create our own timers, track our own mice and generally do things the easy way; we have to live in the browser and follow the browser's rules.
I refer you to the Web Plugin Programming Topics Guide:
Beginning in Mac OS X v10.6, on 64-bit-capable computers, Netscape-style plug-ins execute in an out-of-process fashion. This means that each Netscape-style plug-in gets its own process separate from the application process. This design applies to all Netscape-style plug-ins. (It does not apply to WebKit plug-ins, nor at present to WebKit-based applications when running in 32-bit mode.)


It goes on to enumerate a large number of new restrictions on how plugins are supposed to operate in their new out of process home. At this point, it becomes pretty hard to justify using an NPAPI style plugin for any but the simplest Safari plugin; it is just too strangling. So the mind turns to WebKit plugins, which are basically Cocoa NSViews with a few limitations and added message handlers. As the quote above says, they are not (at present) limited in the way Netscape plugins are; in this way Apple is nudging people toward WebKit plugins, if for no other reason than that they allow you to do things that NPAPI plugins do not.

And here we come to the idea for this posting: it seems as though Apple is interested in locking down and rationalizing Safari plugins. It is also reasonable that at some time in the future, they will allow plugins for Safari on the iPad. It is almost a given that those plugins will not be NPAPI-style plugins, but will instead be UIView or UIViewController derivatives. And at that point the question becomes: wouldn't the UIView-style plugin, with its simpler and more modern object design, make a cleaner foundation for plugins for Safari on the Mac? Apple already has an iPhone simulator on the Mac. Of course, there would have to be some extension allowing mouse hovering events.

    Advantages
  • Code sharing between Mac and iPad
  • Plugins are closer in scope to iPhone apps than full blown Mac apps
  • At this point most Objective-C programmers are iPhone developers so developer pool would be larger
  • Expand iPhone apps to the Mac
  • iPhone Apps are used to being sandboxed

Anyway, just a thought.

Thursday, February 18, 2010

Inexpensive Ion HTPC

As I want to improve my free MythTV remote, and I needed to "eat my own dog food", I decided to build a HTPC box for the TV room to run the MythTV frontend. I'd been interested for a while in a small, low power PC based around the Nvidia Ion chipset. A little research indicated the board to get was the Zotac IONITX-A-U as it:
  • Comes in a compact, mini-ITX form
  • Has the 64-bit capable dual core Atom 330 processor
  • Has multiple output ports, including VGA and HDMI for video, and SPDIF coax and TOSLink optical for audio, amongst others
  • Has Gigabit Ethernet
  • Has an external power supply to keep the heat out of the case

The most important feature was quietness, with the related property of energy efficiency. There are many cheap PCs out there, but few can play 1080i MPEG2 videos while drawing 31 Watts.



I put in 2GB of RAM, which allowed me to configure the BIOS to set aside half a gig for video memory. I removed the included wireless card, as I have Gigabit Ethernet in my TV cabinet, and there's no reason to waste whatever minimal energy would be used by the card. I threw in a 60 GB laptop drive I had from a few MacBooks ago, and put it all in a small case. The hardest part of installing MythBuntu 9.10 64-bit was scrounging up an external optical drive. So, all in all, a very easy thing to put together.

It works fairly well. I've put my children's DVD collection on a network share and use MythVideo to watch that and recorded TV shows. Since this was my first real standalone MythTV frontend, I had to rejigger my video setup so that the backend and frontend used the same relative paths to videos and movie posters (both Linux boxes have paths of the form /mnt/MyNAS/MyVideos and /mnt/MyNAS/MyPosters). Once this was all set up, it's extremely convenient having a kid's movie a few touches away. I had read complaints about the fan needing to be run at a lower speed, but I really can't hear it from the couch.

    Energy usage measured via Kill-A-Watt
  • Idle Running MythTV Frontend: 25W
  • Playing DVD Image: 26W
  • Playing 720p MPEG2 (recording of 24 on Fox): 28W
  • Playing 1080i MPEG2 (Big Bang Theory on CBS): 31W
I have got to figure out how to get the box to put itself to sleep when I'm not using it.

My old LCD TV makes it impossible to get the VGA output just right (no surprise there), so I have a two inch black bar on the right of the display, but I'll be moving to the DVI port as soon as Monoprice gets me an HDMI switch.

It's interesting as a Mac guy seeing this little DIY assembly. On the one hand, it's pretty inexpensive and I wouldn't want to waste a Mac on single purpose computing. On the other hand, it's pretty darn cheaply made. Here's a photo comparing an older generation Mac Mini with my system.


The Mini is definitely a superior computer in terms of build quality. The M350 case is nice enough, but it is still ill-fitting sheet metal, and the ports still wiggle and flex when you try to push a connector in. In comparison, the Mini is just rock solid, and smaller, while still having a faster processor and an optical drive. And it's a darn sight more attractive. On the other hand, the Zotac has a ton of extra ports, including a variety of internal SATA and USB connectors. Different needs.

Cost:
  • Motherboard: $185
  • RAM: $45
  • Case (with Shipping): $50
  • Hard Drive: Free
  • --------------
  • Total: $280

Monday, February 01, 2010

A Cell Phone Game for Children

I was playing hide and seek with my 2 and 3 year olds, and came up with this variation.

Requirements
Two cell phones, one of which has a speaker phone mode.
Instructions
1) Call the phone with the speaker phone mode, answer it, and put it in speaker mode.
2) Have someone hide the phone in another room.
3) Let the child try and find it by yelling into his phone.