A young guy just walked into our office. I initially thought he was selling
something, but that turned out to be wrong. He said that he was using our
open WiFi, and asked if we minded. We looked at each other, shrugged, said no,
as long as he didn't use too much bandwidth and didn't do bad things like
sending spam (we pay a flat rate, and use ssh/ssl for everything).
Then the guy asked for some technical help -- in particular, he asked for
an outgoing SMTP server that he could use. He mentioned that he lived in a
building across the courtyard, that the connection was a bit weak, and asked
whether we minded if he set up a repeater in his flat.
It turned out he was a designer, studying cinematography.
It was a little bit weird.
I upgraded to Breezy
yesterday morning (1 hour for the dist-upgrade, 1 hour patching and
compiling a kernel to fix the radeon power-drain bug, 2 hours figuring out
why Firefox wouldn't start).
I decided to try out Beagle.
sudo apt-get install beagle
This is all it takes to install it, run the indexing daemon, and run the
search utility that sits in a tray icon. (I had enabled user_xattr in
my /etc/fstab a while ago.)
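(For the record, enabling user_xattr means adding that mount option to the
relevant line in /etc/fstab and remounting; the device and mount point below
are made up for illustration:)

```
/dev/hda3  /home  ext3  defaults,user_xattr  0  2
```

followed by something like sudo mount -o remount /home.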
I was impressed by the background indexer -- it really is extremely
nonintrusive: no slowdowns, no excessive disk I/O. I couldn't even tell it
was running. I left the laptop running overnight
and went to sleep.
In the morning I discovered that the Mono-based beagled daemon had eaten 300
megs of virtual memory (all that was left, and then some -- the laptop started
swapping). Worse, it had eaten all the remaining disk space (I had about a gig
left).
~/.beagle/ takes up 500 megs; the rest is probably metadata in extended
attributes, scattered all over the place.
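If I wanted to check that guess, a rough lower bound could be obtained by
walking the tree and summing the size of every attribute value -- a sketch
in Python (assuming a Linux filesystem and Python's os.listxattr/os.getxattr;
attribute names and filesystem block overhead are not counted):

```python
import os

def xattr_bytes(root):
    """Sum the sizes of all extended-attribute values under root.

    Counts value bytes only -- attribute names and filesystem block
    overhead are ignored, so this is a lower bound at best.
    """
    total = 0
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            try:
                for attr in os.listxattr(path, follow_symlinks=False):
                    total += len(os.getxattr(path, attr, follow_symlinks=False))
            except OSError:
                pass  # unsupported fs, permission denied, dangling symlink
    return total
```

Running xattr_bytes(os.path.expanduser('~')) would then give a ballpark figure.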
How do I measure the disk space taken by extended attributes? How do I
discover how complete the index is? I presume beagled knows which files it
has already indexed, and which it still plans to index in the future.
What are the RAM and disk costs of indexing 20 gigs of data (that's my ~ for
you)? Will beagled eat inordinate amounts of RAM only during indexing, or
all the time?
I only have 512 megs of RAM in this laptop; I do not want to sacrifice 60%
of that to beagled. Likewise, 1 gig out of a measly 40 gig disk feels like
a lot to pay for some convenience. I think I shall go back to locate +
recursive grep for now, or disable blanket indexing and ask Beagle to index
only a few subdirectories. I don't really need an index of all those Zope 3
and SchoolTool source trees.