Here’s what I’ve just sent to the London Decompression leads’ list; sadly, sometimes we have to take a stand for what we believe in.

Following on from the information about the proposed venue (‘Cable’) for this year’s London Decompression, their insistence on using “Clubscan”, and my principles, I feel I can no longer be involved in, or participate in, this year’s event.

Should there be a change in their deployment of the Clubscan products/services (or any similar ones), I may be able to reconsider this.

I’m really quite disappointed that, as a group, we’ve not, to my knowledge (nothing’s been mentioned about it), regarded privacy as a concern, or carried out, for example, a Privacy Impact Assessment.

Whilst many folks are content to give up their privacy, for some of us it’s something that we, and our forebears, have fought for, and are still fighting for, including by taking matters to the courts.

It was through privacy campaigning that I was introduced to the Burner community (as strange as that may seem). I know I’m not the only privacy-aware/concerned individual amongst our community.

That’s even before the issue of bringing out passports/photo ID, let alone surrendering the data.

I should make it clear that, whilst I am a fairly well-documented privacy campaigner, I’m far from the irrational, knee-jerk contingent; I review matters and then form rational decisions.

The whole concept irks me immensely, and puts me at ideological odds with the rest of you.

In the spirit of collective decision-making, I sadly no longer wish to take part in either the organizing of, or participation in, this year’s London Decompression. I know of friends who will not be participating, either.

As such, I have no choice but to step down with immediate effect. Please do not contact me regarding anything relating to the London Decompression, unless it’s to say that Southwark Council has rescinded its requirement, that the council/licensing board/venue chosen takes privacy seriously, and that it ceases to insist on the installation and use of any “Clubscan”-type products. The whole principle, and their “argument” for the deployment, does not, to me, hold water.

For the time being, whilst I further evaluate my feelings/options, I will not be rescinding the use of my domains. However, should this contemptuous disregard for privacy continue, the offer will not be available again.

I honestly wish that I could offer my hopes for the best, but under these circumstances, and given my hardly-concealed views, I don’t feel able to. Particularly as doing so supports a company that, in my view, pushes the limits of the spirit of the law, as well as promulgating a ‘papers, please’ culture: and that’s just to ‘have fun’.

I’ve recently(-ish) started using Transmission as my torrent client; the change-over comes from my switching-things-off approach: instead of keeping KTorrent running on the laptop (and caning my bandwidth), I can have something run mainly on the NAS, which is always on (bar power cuts/maintenance).

One of the things I noticed was the apparent lack of renaming within Transmission (and the curious way earlier tickets get marked as duplicates of later ones).

So, erm, I’ve written something that works for me. And I’ve hacked the mailer script into something a little cleaner, at least in my view.

The premise is that you’re using a POSIXish operating system — my NAS runs on Debian — and that all of your exports are within the /nas directory, and your torrents directory is /nas/torrents.

The change needed (with Transmission not running while you edit, apparently, as it rewrites the file on exit) is to set /path/to/post-download as the value of script-torrent-done-filename in settings.json.
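On a Debian package install, I believe settings.json lives under /etc/transmission-daemon/ (an assumption on my part; check your own layout), and the relevant fragment would look something like this (the enabled flag is also my assumption):

"script-torrent-done-enabled": true,
"script-torrent-done-filename": "/path/to/post-download",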

You’ll need to echo the destination path (“/path/to/store/the/completed-file”) into a file named as per the torrent (see your incomplete directory for the exact name), but with “.move” appended; the rest should all happen automagically.
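For example (a sketch: the destination is made up, the torrent name is the one used further down, and I’m assuming the .move files live alongside everything else in /nas/torrents):

# record where this torrent should end up once complete
echo "/nas/media/isos" > "/nas/torrents/ubuntu-10.04.1-alternate-i386.iso.move"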

You might find it useful to chown the directories you’ll be moving things into to the user running the Transmission processes; I tend to set the setgid bit with my own GID, too.
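On a stock Debian box the packaged daemon runs as debian-transmission (an assumption that you’re using the package; the group here is a stand-in for your own), so something like:

# let the daemon write to the destination; setgid so new files
# inherit the directory's group
chown debian-transmission:mygroup /nas/media/isos
chmod g+s /nas/media/isos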

The other file, mvtor, is for doing a manual move; specify the torrent as an argument, e.g. ./mvtor "ubuntu-10.04.1-alternate-i386.iso"
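Not the script itself, but a minimal sketch of the shape such a thing might take, assuming downloads land in /nas/torrents and the .move convention described above:

#!/bin/sh
# mvtor (a sketch): manually move a completed torrent to the
# destination recorded in its ".move" file
TORRENT="$1"
DEST=$(cat "/nas/torrents/${TORRENT}.move")
mv "/nas/torrents/${TORRENT}" "${DEST}/"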

I thought other listadmins might be having fun with gmail now being available in the UK (rather than “googlemail”, as it has been for a while, despite ‘gmail’ originally being available, back in the invitation-only days), so I thought I’d share my hackish way around this, letting listfolks post from their gmail.com addresses.

It’s not pretty, but works for me — pre-requisite, Mark’s very useful “non-members” script: http://www.msapiro.net/scripts/non_members

mkdir -p ~/tmp/gmail

#find who you need to work with:  
list_lists -b | while read L
do
    list_members ${L} | grep googlemail > ~/tmp/gmail/${L}
done

#Zap announcement lists from the files; remove empty files, too.
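For the empty files, a line like this (one way of doing it, my assumption; the announcement lists themselves want pruning by hand) does the job:

find ~/tmp/gmail -type f -empty -delete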

#Let them post!  
cd /var/lib/mailman/bin
ls -1 ~/tmp/gmail | while read L
do 
    sed 's/@googlemail.com/@gmail.com/' ~/tmp/gmail/${L} | while read X
        do
            ./non_members --list=${L} --filter=accept --add ${X} --verbose
        done
done

(nb: the path (/var/lib/mailman/bin) is from a Debian machine, with Mailman installed via packages; in my case /var/lib/mailman/bin is also in my ${PATH}, so replace those as appropriate in your case.)

Which seems to have done the trick.

Ever wanted to know who the OEM/supplier/manufacturer of the network devices attached to a machine was?

I did. And couldn’t see anyone else’s script to steal, so here’s a really ugly way to do it:

#!/bin/sh
#
# arpinfo:
#   look up the hardware vendors for entries in the ARP table,
#   against the IEEE OUI registry
#
# Copyright (c) 2010 Adam McGreggor. Some rights reserved.
# Email: <adam@amyl.org.uk> Web: <http://blog.amyl.org.uk>
#
# $Id:$
#
WEBSOURCE=http://standards.ieee.org/regauth/oui/oui.txt
DOC=/usr/local/doc/oui.txt
curl --silent "${WEBSOURCE}" -o "${DOC}"
# oui.txt lists upper-case, dash-separated prefixes ("00-1A-2B"),
# so skip arp's header line and upper-case the octets to match
arp -n | awk 'NR > 1 {print $3}' | awk -F: '{print $1"-"$2"-"$3}' \
    | tr '[:lower:]' '[:upper:]' | while read ARP
do
    grep "${ARP}" "${DOC}"
done
# and print the table itself, for matching the vendors up by eye
arp

Works for me… although it could do with a tidy-up. As a quick and dirty thing, mind…
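Part of that tidy-up might be not fetching oui.txt on every run; a guard along these lines (a sketch of mine, with an arbitrary 30-day freshness window) would do:

# (re-)fetch the registry only if it's missing or stale
if [ ! -e "${DOC}" ] || [ -n "$(find "${DOC}" -mtime +30)" ]; then
    curl --silent "${WEBSOURCE}" -o "${DOC}"
fi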


Thought this might double up as a note of the Firefox extensions I currently have installed. I’ve tried scripting this, but the source file isn’t something I’m over-familiar with, and getting the fields to match up ain’t happening, due to my crapness.

Anyhow, I would appear to have these Firefox extensions installed:

A few of those don’t have links I can identify from the URI.

Want some code that vaguely does this for you?

#!/bin/sh
#
# ffexts:
#   list firefox extensions: names and URIs for download/homepage
#
# Copyright (c) 2010 Adam McGreggor. Some rights reserved.
# Email: <adam@amyl.org.uk> Web: <http://blog.amyl.org.uk>
#
# $Id: ffexts 119 2010-01-10 00:38:04Z adam $
#
set -e
MOZDIR=~/.mozilla/firefox
PROFDIR=$(ls -lha ${MOZDIR} | grep default | awk '{print $NF}')
FILE=extensions.rdf
INFILE=${MOZDIR}/${PROFDIR}/${FILE}
OF=~/tmp/ffexts
OUTFILE=~/pseudohome/nas-docs/firefox-extensions-$(date '+%Y%m%d')
# check for existing outfile, as we'll be
# appending; if so, zap it
if [ -e ${OUTFILE} ]; then
    rm ${OUTFILE}
fi
# grab the interesting bits from the RDF file
for K in name homepageURL
do
    # nice fix-up, eh?
    grep "NS1:${K}" ${INFILE}  | sed -e "s/NS1:${K}=//" \
            -e 's/"//g' -e 's/>//' \
            -e 's/^[ \t]*//' | sort | uniq > ${OF}-${K}
    # using wc here is entirely optional
    wc -l ${OF}-${K}
    # append
    cat ${OF}-${K} >> ${OUTFILE}
done