Sunday, December 21, 2008

MythTV, HD-PVR, Trunk

Until Hauppauge released the HD-PVR this past May, there was no way to record in high definition from your cable or satellite box. Your HD recording options were limited to over-the-air, unencrypted QAM, or, in rare instances, unencrypted FireWire from the cable box. In any case, there was no way to record the good stuff in HD. I'll spare you the details about the HD-PVR itself and instead include some links at the end.

Of course, Linux support for the HD-PVR wasn't available straight away. However, this is a big breakthrough for homebrew PVRs, so the MythTV/Linux community has been hard at work. There is a Linux driver for the HD-PVR, but you have to check it out and build it yourself (see links at the end). Early work has also been done on MythTV to support the HD-PVR, but to get that support you have to run MythTV from trunk rather than your distribution's packaged version. This means checking out the source and building it yourself. The source changes daily, and at times things get worse before they get better. Whatever you check out on any given day should be in decent shape, but it may not be. You are forging MythTV from the molten center of the Earth, and it's still hot and a little bit on fire.

What is now in trunk will eventually make it into your distribution, but there's generally a 6-12 month lag, depending on your distribution's release cycle. Well, I'm ready to give the HD-PVR a try now, so I decided to give trunk a go. I set up a spare machine on my workbench to act as both frontend and backend. After getting the HD-PVR drivers compiled and working, I checked out MythTV from Subversion and built it. There are good documents detailing how to do this, and it's actually not that hard; the build system for MythTV is very nice. I recommend starting with a MythTV-based distribution like Mythbuntu and removing all the mythtv packages. This saves you a lot of grunt work like setting up MySQL, the mythtv user, the init scripts, etc.
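For reference, the checkout-and-build procedure looks roughly like the following. This is only a sketch: the Subversion URL and configure options are from memory, so verify them against the official installation guides (linked at the end) before running anything, as trunk details change.

```shell
# Sketch only -- confirm the repository URL and build options against
# the MythTV wiki first; trunk changes frequently.
svn checkout http://svn.mythtv.org/svn/trunk/mythtv mythtv
cd mythtv
./configure          # add options here if you want, e.g. CPU optimizations
qmake mythtv.pro
make
sudo make install
```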

Anyway, I've got the HD-PVR working with MythTV. I haven't used it for any actual TV watching or recording yet. As soon as I can get a frontend set up, I'll be able to actually try using it day-to-day.

Relevant links:
http://www.geektonic.com/2008/06/hauppauge-hd-pvr-unboxing-first-look.html
http://mythtvnews.com/2008/05/30/hauppauge-hd-pvr-support-coming-for-mythtv-devices-are-shipping-now/
http://mythtvnews.com/2008/06/14/hauppauge-hd-pvr-linux-driver-released-alpha-quality/
http://www.mythtv.org/wiki/index.php/Ubuntu_Installation_Guides
http://www.mythtv.org/wiki/index.php/Hauppauge_HD-PVR

Ditching apt-cacher-ng for Squid

apt-cacher-ng hasn't worked out as well as I had hoped. I kept having a problem where the Release file from the apt repository would be reported as corrupted. I could go into the cache directory and manually remove that file, forcing it to be downloaded again, but that would only help the system I was updating at the moment. The next system I tried to update would have the same problem.

I've read a number of articles about just using Squid as an apt cache. I avoided this at first because Squid isn't really made for that. For example, Squid has no way of knowing when a specific version of a package has been made obsolete by a newer one. I assume apt-cacher-ng and apt-proxy know how to handle this (maybe not). Squid also isn't intended to cache arbitrarily sized objects for indefinite periods of time. I decided to give it a try anyway, since a number of people seem to be having success with it.

I had to tweak Squid's default configuration a bit to make it suitable for apt-caching.
The Squid configuration file is well documented, but below are the directives of interest:

refresh_pattern deb$ 1577846 100% 1577846
refresh_pattern Packages.gz$ 1440 100% 1440
cache_dir ufs /var/spool/squid 15000 2 8
maximum_object_size 409600 KB

The first line says to cache anything ending in "deb" for three years ('$' is the regular-expression end-of-line anchor). Some packages rarely, if ever, get updated, so I want to make sure they stay in the cache the entire time I'm using a given distribution release. I figure I'll probably be on a particular distribution release for no longer than three years.
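Two details worth sanity-checking here: the first field is a regular expression, so 'deb$' matches any URL ending in the letters "deb" (a stricter pattern would be '\.deb$'), and the refresh_pattern ages are given in minutes. A quick check of both:

```shell
# The pattern is a regex: only names ending in "deb" match.
printf '%s\n' foo.deb Packages.gz foo.deb.md5 | grep 'deb$'
# -> foo.deb

# refresh_pattern ages are in minutes; three 365-day years:
echo $((3 * 365 * 24 * 60))
# -> 1576800, in the same ballpark as the 1577846 used above
```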

The second line says to cache anything ending in "Packages.gz" for one day.

The "cache_dir" line says to put Squid's cache in /var/spool/squid (I believe this is the default) and to let it grow no larger than 15,000 megabytes. My system partition gets very little use, so 15 GB is no problem for me. The last two numbers tell Squid how to structure the cache: "2" says to create two level-1 directories and "8" says to create eight level-2 directories under each. The defaults are 16 and 256. I read an article where the author's hard drive would never spin down because Squid was rescanning the cache every few seconds, and reducing the number of L1 and L2 directories helped. If anyone can find that article, please post a link in the comments.

The "maximum_object_size" line tells Squid to cache objects up to 400 MB. There shouldn't be any debs even close to that large. Squid's default is much smaller than this.
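The arithmetic behind those last two directives, for anyone double-checking the units: the directory counts multiply out to far fewer cache buckets than the defaults, and maximum_object_size is given in kilobytes.

```shell
# Cache directory buckets: settings above vs. Squid's defaults
echo $((2 * 8))          # -> 16 subdirectories with the settings above
echo $((16 * 256))       # -> 4096 subdirectories with the defaults

# maximum_object_size is in KB; 409600 KB works out to:
echo $((409600 / 1024))  # -> 400 (MB)
```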

In order to get apt to use the proxy, I created a file called "proxy" in /etc/apt/apt.conf.d/. You can call the file whatever you want. The file contains the line:

Acquire::http::Proxy "http://hoth:3128";

"hoth" is the hostname of the Squid server. Make sure there are no "Acquire" directives in any other apt configuration file:

/etc/apt/ # grep -rn Acquire *

I discovered that on some systems there was an "Acquire" directive explicitly telling apt not to use a proxy.

So far, Squid has been working pretty well. There was one instance where apt failed to completely download a package, but after running apt-get dist-upgrade again, the package successfully downloaded.
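A quick way to confirm the cache is actually serving packages is to look for TCP_HIT entries in Squid's access.log. The sample log lines below are fabricated for illustration (abbreviated from Squid's native log format), just to show the grep:

```shell
# Count cache hits in a couple of fabricated access.log-style lines
printf '%s\n' \
  '1229870000.123 45 192.168.1.10 TCP_HIT/200 10240 GET http://us.archive.ubuntu.com/ubuntu/pool/main/f/foo.deb' \
  '1229870001.456 900 192.168.1.11 TCP_MISS/200 20480 GET http://us.archive.ubuntu.com/ubuntu/pool/main/b/bar.deb' \
  | grep -c 'TCP_HIT'
# -> 1

# Against a live server, something like:
#   grep 'TCP_HIT' /var/log/squid/access.log | tail
```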

Tuesday, October 07, 2008

FiOS

We just got our FiOS internet hooked up. I opted for the 20 Mbps down, 5 Mbps up option. Here's a bandwidth test from speedtest.net. Other than being really annoyed at how difficult (i.e. almost impossible) Verizon makes it to use your own router instead of theirs, it's pretty awesome. I'm in the process of wget-ting the internet as we speak. Starting with .com.

Wednesday, October 01, 2008

Awesome Log Message

I saw the greatest log message in my MacBook's system log this morning.

SyncServer[8551]: SyncServer: Truth vacuumed. Next vacuum date 2008-10-15 08:13:17 -0400

I'm sure it's something normal, and, given appropriate context, would make perfect sense. But I am just amused that the SyncServer is vacuuming up all the truth. No wonder I have a hard time finding useful log messages when I'm trying to troubleshoot something.

Tuesday, September 09, 2008

apt-cacher-ng

I have several machines on my network running Ubuntu Linux which need to be updated regularly to keep up with security updates and the like. Several months ago I decided to try using a caching proxy to get updates from the Ubuntu package repositories in order to reduce load on the Ubuntu servers as well as to speed up the updating process on my systems. After spending some time using apt-proxy, and being generally dissatisfied with its reliability, I decided to give apt-cacher-ng a try.

Unfortunately, apt-cacher-ng's documentation is a bit inaccessible. Aside from not being available on the project's website (it's a PDF in the documentation directory after you install it), the documentation, although complete, is kind of impenetrable. There also really isn't anything along the lines of HOW-TOs or FAQs that I could find on the 'tubes. I eventually got it worked out, and, at least for the common case, it turned out to be more straightforward than the documentation leads you to believe. Here's how I set mine up.

I put entries in my acng.conf like the following:
Remap-ubuntu: http://us.archive.ubuntu.com/ubuntu
Remap-medibuntu: http://packages.medibuntu.org/

The way this works is that you specify a name for each Remap-[whatever] entry, where [whatever] is whatever you want to call it--the names just have to be unique. The part after the ':' is the literal URL of the repository you're caching, including the directory path on that server, if there is one.

The documentation describes a third part that follows a ";". If your second part is the actual URL of the repository, you don't need the third part, as you see above.

In my sources.list, I put
deb http://my-server-name:3142/packages.medibuntu.org hardy free non-free
deb http://my-server-name:3142/us.archive.ubuntu.com/ubuntu hardy main restricted universe multiverse
Here, the URL to the apt repository is your local server name (the port is 3142 unless you change it in acng.conf), followed by a path component that is exactly the fully qualified domain name of the real apt repository, followed by whatever directory path you appended to that repository's URL in acng.conf.
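In other words, apt-cacher-ng recovers the upstream URL by replacing the proxy's host and port with the real scheme. The rewrite can be sketched in a couple of lines of shell, using the hostnames from the examples above (the Release path is just an illustrative request):

```shell
# Strip the proxy prefix from a request URL to recover the upstream URL
proxy='http://my-server-name:3142'
request='http://my-server-name:3142/us.archive.ubuntu.com/ubuntu/dists/hardy/Release'
upstream="http://${request#"$proxy"/}"
echo "$upstream"
# -> http://us.archive.ubuntu.com/ubuntu/dists/hardy/Release
```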

There are lots of other options described in the apt-cacher-ng documentation, but I believe this is the common case, and it's working fine so far.

Update:
The directory structure that gets created under /var/cache looks like:

apt-cacher-ng
|-- canonical-partner
| `-- dists
| `-- hardy
|-- ubuntu
| |-- dists
| | |-- hardy
| | |-- hardy-backports
| | `-- hardy-updates
| `-- pool
| |-- main
| |-- restricted
| `-- universe
`-- ubuntu-security
|-- dists
| `-- hardy-security
`-- pool
|-- main
|-- multiverse
`-- universe
