All posts by Oluf Lorenzen

Burp on FreeNAS/FreeBSD

# install bash for the Linux experience (and some scripts)
pkg install bash libiconv coreutils

# download burp (fix the version!), extract
curl "http://softlayer-ams.dl.sourceforge.net/project/burp/burp-1.4.34/burp-1.4.34.tar.bz2" > burp-1.4.34.tar.bz2
file burp-1.4.34.tar.bz2 
tar xf burp-1.4.34.tar.bz2 
cd burp-1.4.34
# install dependencies
pkg install openssl librsync perl5 gmake
# configure and install
# add '--disable-ipv6' if you do not have ipv6 properly configured!
# -- https://github.com/grke/burp/issues/182
./configure --prefix=/usr/local --sbindir=/usr/local/sbin LDFLAGS="-L/usr/local/lib" --sysconfdir=/etc/burp
gmake
gmake install
# replace 'date' in timer_script, as FreeBSD's date does not behave like GNU date on Linux
sed -i ".bak" "s,date ,/usr/local/bin/gdate ,g" /etc/burp/timer_script

You may need to set ca_burp_ca = /usr/local/sbin/burp_ca in /etc/burp/burp-server.conf.
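
A quick way to check which burp_ca the shipped config points to (the /etc/burp path follows from the --sysconfdir flag used above):

# show the current setting; it should end up as
# ca_burp_ca = /usr/local/sbin/burp_ca
grep ca_burp_ca /etc/burp/burp-server.conf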

# run burp-server in foreground to see what happens (CA is generated)
/usr/local/sbin/burp -c /etc/burp/burp-server.conf -F
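
Once that run finishes, the generated CA and certificates should show up below /etc/burp; the exact directory depends on the ca_* settings in burp-server.conf, /etc/burp/CA is only the usual default:

ls -R /etc/burp/CA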

Create a really simple startup script /usr/local/etc/rc.d/burp:

#!/bin/sh

/usr/local/sbin/burp -c /etc/burp/burp-server.conf &
exit 0
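
Make the script executable, otherwise the rc system will not run it at boot:

chmod +x /usr/local/etc/rc.d/burp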

Updated 2015-04-09: added the date replacement with sed.

owncloud-woes

So you installed ownCloud and get an HTTP 500 error for a preview image: /index.php/core/preview.png?x=36&y=36&file=%2F%2FownCloudUserManual.pdf&c=53b5540ae7d0e

Error in the admin interface: Postscript delegate failed `/tmp/a1blah2c1774205d9ae418a572cdc38pdf': No such file or directory @ error/pdf.c/ReadPDFImage/677

Install ghostscript, which ImageMagick needs as its Postscript/PDF delegate. W(
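
On a Debian-style box (an assumption, the post does not say what this ownCloud runs on) that boils down to:

# ImageMagick hands PDFs to ghostscript, hence the failing "Postscript delegate"
apt-get install ghostscript
# quick sanity check with any PDF lying around ('some.pdf' is a placeholder)
convert 'some.pdf[0]' preview.png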

Prosody with authentication against LDAP/ActiveDirectory

I am using

  • Prosody v0.9.1
  • sasl2-bin v2.1.25
  • Debian 8/jessie

You need several packages:

apt-get update ; apt-get install sasl2-bin libsasl2-modules-ldap lua-ldap lua-cyrussasl

and the following config files:

/etc/default/saslauthd

START=yes
MECHANISMS="ldap"
MECH_OPTIONS="/etc/saslauthd.conf"

/etc/saslauthd.conf

ldap_servers: ldap://ldap.example.com/
ldap_search_base: ou=foo,dc=example,dc=com

ldap_bind_dn: ldap-user-for-binding
ldap_bind_pw: pw-for-that-user
ldap_use_sasl: no
ldap_start_tls: no
ldap_auth_method: bind

ldap_filter: (sAMAccountName=%u)

/etc/prosody/prosody.cfg.lua

authentication = "cyrus"
cyrus_service_name = "xmpp"

-- optional: configure SSL properly
ssl = {
        key = "x";
        certificate = "y";

        options = { "no_sslv2", "no_sslv3" , "no_ticket", "no_compression" };
        ciphers = "HIGH:!DSS:!aNULL@STRENGTH!:!DES-CBC3-SHA:!ECDHE-RSA-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA";
}

Add the system user 'prosody' to the 'sasl' group and restart both services:

adduser prosody sasl ; service saslauthd restart ; service prosody restart

If something does not work, have a look at /var/log/auth.log for SASL problems, or at the Prosody logs.
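
To test the saslauthd side on its own, testsaslauthd from sasl2-bin is handy; user and password are placeholders here, and -s matches the cyrus_service_name from above:

testsaslauthd -u someuser -p secret -s xmpp
# should report OK when the LDAP bind works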

a tale of fail and win (image recovery/management under linux)

  1. use git-annex-assistant to create backups on several destinations
    1. use test-repo first, do some tests
    2. try on smaller directories w/ actually valuable data, create backups first
    3. annex-ize several other directories
    4. remove the picture backup from the external HDD to make space for the new backup via git annex (very bad idea)
    5. annex-ize several GB of pictures dating back to 2004 (RAWs and JPGs)
    6. fail somehow several times, remove the .git directory, start anew
    7. (do some other stuff)
    8. get back to the picture dir, realize that it is empty (besides some folders) and that the .git directory contains nothing
  2. use ntfsundelete and some proprietary tools to recover the files (only marked as deleted) from the NTFS volume (900 GB); see the ntfsundelete sketch after this list
    • use git annex fsck on the recovered .git data, get only some pictures back, not very much (about 2k files)
  3. use photorec on several runs to recover .jpg and .cr2 (RAW) data
  4. try to use picasa on the files to get some sorting (and to kick out unwanted data such as images from games etc.)
    • picasa somehow mangles the RAW files :(
    • picasa does not properly use the EXIF-provided creation date, but a mixture of that and the files' date w(
  5. fiddle around with exiftool to get back the timestamp from the files' EXIF data
    find . -type f -name "*.jpg" -exec exiftool  -FileModifyDate\<DateTimeOriginal {} \;
  6. try digikam
    1. somehow works
    2. slow on previews when using 'import from files'
    3. slow on DB handling
    4. hangs when moving about 6k (?) files from one folder to another
    5. switch to MySQL as backend
      • somehow fail, try google
      • realize that the internal MySQL server won’t do, install external one
      • use 'settings'->'Database migration' before switching via the config
    6. speed is better
    7. use the duplicate detection to remove redundant files (takes time …)
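
For reference, the ntfsundelete part of step 2 boiled down to something like this; device and destination paths are made up, and only the free tool is shown, not the proprietary ones:

# list deleted .jpg files that are still fully recoverable
ntfsundelete /dev/sdb1 --scan --match '*.jpg' --percentage 100
# copy them out -- always onto a different disk than the one being recovered
ntfsundelete /dev/sdb1 --undelete --match '*.jpg' --destination /mnt/recovery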