Tuesday, September 25, 2007
Very non-intuitive error message from SVN
sh: ./svn-commit.tmp: Permission denied
svn: Commit failed (details follow):
svn: system(' svn-commit.tmp') returned 32256
It sounds like a file permission problem, but I couldn't find any problems - I was working in my own directory that I checked out of SVN. I ran strace to see what was going on, and right about the time that the Shinola hit the fan, I could see that it was trying to fire up an editor for the check-in comment. I looked, and I did not have the EDITOR environment variable set. Once I set it, it worked fine.
Let's see, "Permission denied" means that an environment variable was not set. Sure, that makes sense - NOT.
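For anyone hitting the same wall, here's a minimal sketch of the fix. Subversion consults environment variables such as SVN_EDITOR and EDITOR when it needs a commit message; with no usable editor it apparently ends up trying to execute just the temp file name, hence the bizarre error. The choice of vi below is only an example.

```shell
# Give svn an editor to run for commit messages. vi is just an example;
# substitute your editor of choice (emacs, nano, etc.).
export EDITOR=vi
echo "EDITOR is now: $EDITOR"
```

Put the export in your shell's startup file (e.g., ~/.profile) so it sticks across sessions.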
enjoy,
Charles.
Tuesday, September 04, 2007
Getting Maxtor OneTouch III to work on MacTel
It includes a button on the front that you can program to do various things, the most obvious being to initiate a backup. But it didn't work; I'd push the button and nothing would happen. After poking around, I found this update for Intel Macs. After installing it (and rebooting), it works like a charm.
enjoy,
Charles.
Wednesday, August 29, 2007
What's in an OS X Package (.pkg) file?
The original motivation was that I wanted to uninstall a package that I'd installed. The package file is Subversion-1.4.4.pkg. The first thing I learned is that packages are another case where a directory (aka folder) shows up as a single file in the Finder. You can see the contents in the Finder by control-clicking on the package file and selecting Show Package Contents. The Finder opens up a new window (just like any other folder).
All I see in my package is a folder called Contents. Within that folder is a file called Archive.pax.gz, which is a compressed pax archive. Pax is an archive program like tar and cpio. (Actually, it's an experiment in genetic engineering - in order to "solve" the tar vs. cpio wars, they merged the two and called it pax, Latin for peace.) The pax archive was subsequently compressed with gzip - hence the gz extension.
To see what's in the archive, we need to decompress it and get pax to print a listing of the contents. I did this as follows, although there are other ways to skin this cat:
cd /tmp
gzip -d < Subversion-1.4.4.pkg/Contents/Archive.pax.gz | pax -v
(Pax contains an option to do decompression, but I'm too old fashioned.) Note that I redirect the compressed archive into gzip. I did this because I wanted to leave the archive compressed. I could have decompressed the archive and then run pax on that, but I wanted to leave the entire package unharmed. The first bit of the output looks like:
-rwxr-xr-x 1 root wheel 1664424 Jun 22 23:57 ./usr/local/bin/svnadmin
-rwxr-xr-x 1 root wheel 1571744 Jun 22 23:57 ./usr/local/bin/svndumpfilter
-rwxr-xr-x 1 root wheel 1664612 Jun 22 23:57 ./usr/local/bin/svnlook
So, we can see that these are the files that got installed (in /usr/local). One nice thing to note is that the path names are all relative - they begin with "./" rather than just "/". This means the files can be installed anywhere - change into a directory, extract the archive, and the files will show up in a directory called usr/local below your current directory.
How can I use all of this to uninstall these files? Again, there are many ways to skin the cat, but here's what I did. I extracted the archive to /tmp. ("I thought you wanted to remove the files. Why are you extracting them again?" Patience, my friend.)
cd /tmp
gzip -d < Subversion-1.4.4.pkg/Contents/Archive.pax.gz | pax -r
This creates all of the files, but under /tmp - e.g., /tmp/usr/local/bin/svnadmin. I can now use find to get me a list of just the files:
find usr -type f -print > /tmp/fff
I then looked through the file names in /tmp/fff, and they made sense, so I removed them all.
sudo rm -i `cat /tmp/fff`
The sudo command was needed because the files were not owned by my user ID. The package installer asked for the administrator password and installed the files owned by root (as I recall). Of course, I could have avoided "installing" the files in /tmp by running the output of pax -v through some awk or perl, but the archive was small, and I knew the options to find off the top of my head - I would have had to look up some awk or perl, since I don't use them that often any more.
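For the curious, here's a sketch of the awk route I skipped. The canned sample line below mimics the pax -v listing format shown earlier (field layout can vary between pax versions, so treat this as illustrative): take the last field, strip the leading dot, and you have the absolute installed path.

```shell
# Parse a pax -v listing line and turn the relative path into the
# absolute installed path. $NF is the last whitespace-separated field.
line='-rwxr-xr-x 1 root wheel 1664424 Jun 22 23:57 ./usr/local/bin/svnadmin'
path=$(printf '%s\n' "$line" | awk '{ sub(/^\./, "", $NF); print $NF }')
echo "$path"    # /usr/local/bin/svnadmin
```

In real use the listing would come straight from `gzip -d < Archive.pax.gz | pax -v`, piped into awk instead of the sample line, and the result fed to `sudo rm -i`.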
enjoy,
Charles.
P.S. This post almost never was: Blogger mangled the fonts repeatedly, and I almost gave up on it. Stupid JavaScript HTML editors!
Wednesday, August 22, 2007
Everything I Know About Business I Learned from My Mama
This book isn't really a how-to manual with lots of nuts-and-bolts details, but it would be a really good book for someone who is thinking of going into business for himself. It even begins with a bunch of reasons why someone shouldn't go into business. If after reading the book, someone still wanted to go into business, he should probably read a few more books before jumping in. (The E-Myth Revisited: Why Most Small Businesses Don't Work and What to Do About It is next on my list to read.)
The book is pretty short and humorous, and the chapters are nice little bite-sized chunks. Here's an attempt at his sense of humor: this would be a great book to keep next to the toilet - each chapter is about one visit long. (I have a special place in my heart for books like that.) I suspect the book was easy for Tim to write since he's been writing newspaper articles for some time. I can easily imagine that many of the chapters are extensions of articles he wrote for the paper, bless his heart.
One minor nit I could pick with the title is that there isn't all that much mention of stuff his mama told him. Rather, there's a fair amount of common sense that one's mother might impart.
I bought this book because I've been listening to Tim on Dan Miller's radio show - see www.48days.com. Dan is a career coach and wrote the book 48 Days to the Work You Love. The two of them spend a lot of time telling people to quit jobs they hate and move towards work they love. The radio show recently ended, and they've switched to an Internet format. The radio shows and the new Internet shows are available as podcasts.
enjoy,
Charles.
Roomba Rants
Our house is a regular petting zoo, so stuff accumulates pretty quickly. This fills up the bin, which isn't so bad because emptying the bin is quick and easy. But the human hair and pet fur fouls the brush, and that takes some serious effort to clean. I suppose the Roomba for pets (which appears to be discontinued) addresses this with a brush that is comparatively easy to clean.
And then last night I discovered a new place to clean: the side sweeper brush. That thing was totally fouled (not surprising considering I just discovered that I needed to clean it - "read all the words"), and it took me 10 to 15 minutes with my pocket knife and tweezers.
I find it ironic that a cleaning tool requires a non-trivial amount of cleaning. Perhaps if I kept up with regular vacuuming, Roomba wouldn't get so overwhelmed by crap.
Finally, the reason I'm just getting back to using Roomba is that I just got a new battery after killing the old one some time ago. It turns out managing batteries with Roomba is a bit different from other devices I've used. Typically, I run devices down and recharge them. If I don't have an immediate use, I'd leave the device uncharged. Well, Roomba keeps discharging the battery even if it's not in use. And if you do that to a battery that's already low, it deeply discharges the battery - definitely not A Good Thing.
I noticed the discharging thing firsthand: sometimes when I'd charge the battery and not use Roomba immediately, the battery would be low when I went to use it. So, Roomba prefers being left on the charger, even if you're not using it, which is something I tend to avoid with other devices. Anyway, the result was that I only got ~50 cycles out of the original battery - not good.
The recording on Roomba's tech support line tells you that if you're not using the unit anytime soon, you should take the battery out. Fortunately, it's easy to install and remove the battery - e.g., no screws required.
Don't get me wrong: Roomba is still fun to use, and I did just put down more money for a new battery. So, I will continue to use it. It's just not as carefree and easy as I'd like to see it.
Later,
Charles.
Friday, August 17, 2007
New iMac
My original plan was to get a Mini and use a KVM along with the PC boat-anchor that I keep around for a client. However, after using a KVM for a while with other PCs, I realized that KVMs can be kinda hokey. For example, my KVM makes a keyboard look generic - i.e., it hides any special keys. Also, although I have a cool 22" Viewsonic LCD, it's not as nice as an Apple display.
I opted for the 20" 2.4GHz model with the stock 1GB of memory. I was very pleased to see how easy it will be to upgrade the memory when I get a few extra dollars. The 24" was pretty tempting (the price is quite reasonable), but I figured if I ever need a larger display, I can use the external video output, connected to my (somewhat inferior) Viewsonic, for a ~40" display experience.
The CPU is hella fast, although I admit I'm comparing it to an 800 MHz G4 iMac that was significantly taxed (~20% CPU) running Firefox and Gmail. To date, the only time I've maxed out both cores was converting audio files into MP3 with iTunes. My Internet is still only 128K ISDN, which makes for slow Gmail, but Steve Jobs can't be expected to fix that.
The new, thin keyboard is something that cannot be appreciated in a store standing over it. You have to sit in front of it with a real chair in a real working position to appreciate the nice ergonomics. My one gripe is that my thumb drive is too fat to fit in the USB sockets.
And finally, the Apple educational discount and promotions were sweet. After the rebates, I'll have a free Nano and printer, both of which I gave to the wife - score!
Anyway, Joe Bob says, 5 stars (on a scale of 4) - check it out!
enjoy,
Charles
Wednesday, June 13, 2007
Google Docs for Family Support
My sister and I have been in the difficult position of trying to coordinate care from afar. We both work full-time, and so we tag-team on calling people. As we call various professionals and friends, we learn new names and phone numbers of more people to call. (Sounds like Amway, but it isn't. :-)
Anyway, I set up a Google spreadsheet with the names and numbers I was working from and shared it with my sister. She added the information she had collected. We also have columns for email addresses (where relevant) and notes about the person (e.g., a friend from church, the social worker from the hospice, etc.).
The collaborative feature of Google Docs (and spreadsheets) is a life saver. And, if you're away from the Internet, just print it out and take it with you - how 20th century. Joe Bob says, four stars - check it out!
enjoy,
Charles.
Thursday, March 01, 2007
Book: Windows Forensics and Incident Recovery
by Harlan Carvey -ISBN 0-321-20098-5
This is a great book because I learned more than I thought I would from it. Coming from a command-line Unix background, I tend to view Windows as excessively GUI-centric (maybe that's why it's called Windows?) and full of opaque Microsoft voodoo. This book showed me that there are plenty of things to be learned from the Windows command line, and there are lots of transparent, open-source tools to expose the inner workings of Windows.
There are really three types of information in this book: how Windows works, tools to collect information about Windows, and the bigger task of forensic information extraction and processing. There is a lot of information about basic operating systems concepts (files, processes, etc.) and how they are implemented in Windows. I especially liked the presentation of user privileges - we typically only hear about those in the context of administrator versus non-administrator, but there is a listing of each of the individual privileges and what they mean. The tools that the author presents are primarily command-line tools, and many of them are written in Perl - very approachable for an old Unix hack. (A second edition of this book would benefit from a treatment of Microsoft's WMIC tool.) With the basic groundwork laid, the author presents a bigger picture of how to use all of the tools in a forensic investigation. He presents a series of dreams, which are a bit corny, but they serve as a sequence of case studies. He also provides a "forensic server" for storing all the little bits of information that get collected - a bit like "real" tools such as EnCase.
Like the book File System Forensic Analysis, one of my favorite aspects of this book is that it provides a lot of practical information about applied operating systems - Windows. The author provides links to a lot of tools and web pages, so this book serves as an excellent starting point to learn a lot more about Windows and forensic data recovery. The text includes complete source code for the Perl tools, so a code-oriented reader can really see what the information is and where it comes from.
If I had to criticize something in this book, I'd say that Chapter 9 on scanners and sniffers drifts a bit from the central theme of the book, but then I've found that to be pretty common in security books because so many of the topics are interrelated; you start pulling one thread on the sweater, and the next thing you know, you've unraveled the whole thing.
All in all, this is a great starting point for learning about forensic data acquisition on the Windows platform.
Enjoy,
Charles.
Tuesday, January 09, 2007
Google Docs - Quick and Dirty Wiki
Google recently bought out JotSpot before I discovered it. It sounds like a cool idea - they set it up for you, and I guess it could be configured to be totally private, which would eliminate the spam. I'll be very curious to see what it looks like when it comes back online.
In the meantime, I've discovered Google Docs and Spreadsheets as a collaborative tool. At first I thought, free Word and Excel - who cares? I've already paid The Evil Empire for my software. But when you add the Internet storage and collaboration features, it becomes very cool. A shared document becomes a wiki page!
I see two really nice features with Google documents compared to run-of-the-mill wikis: the formatting and editing happen in a word processor (implemented in Ajax), not another markup language, and the sharing is on a document-by-document, user-by-user basis. In a small team, it's nice to allow Bob to see the document and Jane to edit it. On another document, they can both edit it. Of course, in a large team, configuring this one-by-one would suck.
Another nice feature is that you can upload documents (from Word, OpenOffice, others), so someone can begin something with a word processor (maybe when s/he is offline) and convert it trivially into a pseudo-wiki page. Sure you can cut and paste from Word into a blog or wiki, but I hate the way some characters get mangled in the process.
The Google spreadsheets are nice for collaborative project management. I've been using Voo2do for managing simple task lists. It's nice, but the collaboration options are limited: you can share a password-protected, read-only view, but to allow someone else to edit, as near as I can tell, you have to grant them access to your whole account - not just the one task list you wanted to share.
With Google spreadsheets, all you have to do is create a spreadsheet with the tasks (ala Joel on Software), and share it, either read-only or read-write.
I'm currently working on a small project with two other people, and I've just gotten into this. So far, it's great. More news when it happens.
Enjoy,
Charles.
Saturday, November 11, 2006
Core2 Mac Mini
Apple hints at Core 2 Duo Mac mini?
How Soon Will the Mac mini Go Core2?
Personally, I'm dying to see a Core2 Mini. At the moment, I'm stuck with an 800 MHz G4 iMac that just doesn't cut it any more, but I can't afford a 20" Core2 Duo iMac.
For the sake of argument, why would Apple not update the Mini to the Core2? Part of that depends on where Apple really sees the Mini. When the (G4) Mini first came out, one of the pitches was for PC developers to use it in addition to a PC via a KVM switch. In this scenario, it made sense for Apple to make it beefy.
When the first Intel Macs came out, the Mini had a Core processor (I'll call the Core processor the "Core1" just to be extra clear) just like the other Macs, in particular the iMac. They wanted the Core architecture, but there weren't many of those processors, so the Mini and the iMac were pretty similar. Then the Core2 came out and Apple bumped the iMac to a Core2 but left the Mini at a Core1.
I take this to indicate that Apple is trying to create some diversity in the product line - i.e., the need to differentiate the Mini from the iMac. Thus, they need to keep the Mini crippled, and they're no longer pitching the Mini as a developer machine. This is (unfortunately) like the IBM PC Jr back in the day. Now, they have bumped both Mini models to a Core1 Duo, but it's still a Core1 not a Core2 - i.e., still crippled.
The other possible motive (for Apple) of keeping the Mini at Core1 is profit margin. When the Core2 came out, Intel slashed the price of the Core1, but Apple has not dropped the price of the Mini. If Apple had a big stock of Core1 processors (especially Core1 Duos) when the Core2 came out (not likely given how shrewd Apple often is), this gives them a way to flush their Core1 inventory. More likely, Apple is just making bank on the reduced cost of the inputs.
So, I can see a few reasons why Apple may not bump the Mini to a Core2 very soon. I hope I'm wrong, because I'd really like a Core2 Mini. The MacWorld release timeframe that is being rumored would fit my budget very nicely.
enjoy,
Charles.