The Linux box in the basement runs the MythTV backend. From it I can watch live HDTV wirelessly on my MacBook, over 802.11g through the Linksys router in my back bedroom. As long as the microwave isn't on, and I stay within 30 feet or so of the router, the picture and sound are typically great.
I have an HDHomeRun networked HDTV tuner sitting right next to that same Linksys router, and using the EyeTV application I can't watch five seconds of TV wirelessly without a skip, a burst of pixelation, or an audio pop. Over wired Ethernet it's fine, but over 802.11g? Forget about it.
I can set MythTV to grab a stream from the HDHomeRun, pull it down to the MythTV box over wired Ethernet, send it back up the same wire to the router, and out as radio waves, and it still plays smoothly. I just watched 20 minutes of 1080i HD golf on CBS without a skip, although twisting the MacBook around will cause the occasional skip at this distance: perhaps 25 feet through a floor and a couple of walls.
Why can MythTV do what EyeTV can't in this situation? I don't know; presumably it has more flexibility to buffer at the source, though it still has to sustain the same average bit rate. MythTV might also be better at handling lost packets... Again, I don't know. I just know MythTV works better for wireless HDTV.
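If I had to guess at the mechanism, it would look something like the sketch below. This is pure speculation, not MythTV's actual code: the idea is just that a backend reading from the tuner into a FIFO holding several seconds of video can ride out a brief radio stall, because the stall drains the buffer instead of losing tuner data. The chunk size, buffer depth, and the source/client handles are all made up for illustration.

```python
# Sketch of a buffering relay between a wired tuner and a wireless client.
# Hypothetical, for illustration only: CHUNK, BITRATE, BUFFER_SECONDS, and
# the source/client objects are assumptions, not MythTV internals.

import collections
import threading

CHUNK = 64 * 1024          # bytes read from the tuner per iteration (assumed)
BITRATE = 19_000_000       # ~19 Mbit/s, roughly an ATSC HD stream
BUFFER_SECONDS = 5         # relay holds a few seconds of video

max_chunks = BITRATE // 8 * BUFFER_SECONDS // CHUNK
fifo = collections.deque(maxlen=max_chunks)  # full deque drops oldest video
cond = threading.Condition()

def reader(source):
    """Fill the FIFO from the wired tuner at the stream's natural rate."""
    while True:
        data = source.read(CHUNK)     # hypothetical tuner handle
        with cond:
            fifo.append(data)
            cond.notify()

def sender(client):
    """Drain the FIFO to the wireless client. A brief radio dropout blocks
    here, letting the FIFO grow, rather than dropping tuner data."""
    while True:
        with cond:
            while not fifo:
                cond.wait()
            data = fifo.popleft()
        client.sendall(data)          # may stall during a dropout
```

A direct tuner-to-client path like EyeTV's has no such cushion: a half-second radio stall is a half-second hole in the picture, no matter how good the average throughput is. That would be consistent with what I'm seeing, but again, it's a guess.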