Archive for August, 2013

Delta Blues playlist for mlmp3p

Friday, August 30th, 2013

Here’s my queue-up for a blues playlist using mlmp3p:

:r/(mississippi john hurt|robert johnson|junior kimbrough|blind willie|champion jack dupree)/artist

Parsing optional input parameters/arguments in MATLAB function

Thursday, August 29th, 2013

Here’s some boilerplate code I use for parsing additional, optional input parameters when I write a matlab function. Usually I structure my matlab function prototypes as follows:

function C = function_name(A,B,varargin)

I comment this prototype with a message that’s very useful when issuing help function_name:

% FUNCTION_NAME This is a high-level description of what function_name takes
% as input, what it does and what it produces as output
%
% C = function_name(A,B)
% C = function_name(A,B,'ParameterName',ParameterValue)
%
% Inputs:
%   A  num_cols by num_rows matrix of blah blah blah
%   B  num_cols by num_rows matrix of blah blah blah
%   Optional:
%     'ParameterName1' followed by an integer blah blah {1}
%     'ParameterName2' followed by one of 'foo','bar',{'oof'} blah blah
% Output:
%   C  num_cols by num_rows matrix of blah blah blah
%

I parse the optional inputs with a while loop and switch like this:

% defaults for optional parameters
parameter_value1 = 1;
parameter_value2 = 'oof';

% parse optional input parameters
v = 1;
while v <= numel(varargin)
  switch varargin{v}
  case 'ParameterName1'
    assert(v+1<=numel(varargin));
    v = v+1;
    parameter_value1 = varargin{v};
  case 'ParameterName2'
    assert(v+1<=numel(varargin));
    v = v+1;
    parameter_value2 = varargin{v};
  otherwise
    error('Unsupported parameter: %s',varargin{v});
  end
  v = v+1;
end

Update: I should probably be using matlab’s built-in inputParser class, but it means that parameter names have to match variable names in my code, and variables are stored as fields in the inputParser instance: e.g. parser.Results.variable_name1. The advantage, though, is that it can easily handle input type validation.
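For comparison, here’s a minimal sketch of what the inputParser version might look like (the validators are my own assumptions; note the 2013-era addParamValue method, since renamed addParameter):

```matlab
% sketch only: mirrors the defaults above using matlab's built-in parser
p = inputParser;
p.addRequired('A');
p.addRequired('B');
% name, default value, optional validator handle
p.addParamValue('ParameterName1',1,@isnumeric);
p.addParamValue('ParameterName2','oof',@ischar);
p.parse(A,B,varargin{:});
% results are fields keyed by parameter name
parameter_value1 = p.Results.ParameterName1;
parameter_value2 = p.Results.ParameterName2;
```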

Here’s an updated custom optional input parser. So far without validation, though it seems clear how to add support via function handles, e.g. @isnumeric, @islogical, @() customthing... .

% default values
variable_name1 = false;
variable_name2 = [1,2,3];
% Map of parameter names to variable names
params_to_variables = containers.Map( ...
  {'ParamName1','ParamName2'}, ...
  {'variable_name1','variable_name2'});
v = 1;
while v <= numel(varargin)
  param_name = varargin{v};
  if isKey(params_to_variables,param_name)
    assert(v+1<=numel(varargin));
    v = v+1;
    % Trick: feval an anonymous function so that assignin targets this workspace
    feval(@()assignin('caller',params_to_variables(param_name),varargin{v}));
  else
    error('Unsupported parameter: %s',varargin{v});
  end
  v = v+1;
end
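Either way, the caller’s side looks the same; a hypothetical call using the second parser’s names might be:

```matlab
% named parameters may come in any order after the required arguments
C = function_name(A,B,'ParamName2',[4,5,6],'ParamName1',true);
```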

Zip up matlab demo and dependencies

Tuesday, August 27th, 2013

I use this all the time to zip up little matlab demos. Let mydemo.m be the name of your demo script:

>> name_of_script = 'mydemo';
>> C = depends(name_of_script);
>> C = C(cellfun(@isempty,strfind(C,'/local/mosek')));
>> zip([name_of_script '.zip'],C);
>> fprintf('This package should contain\n%s/\n',name_of_script);
>> N = regexprep(C,'^.*\/','');
>> fprintf('  %s\n',N{:});

I paste this in the comments at the top of my script so that I can easily copy and paste it to create a new zip. Notice that I’m removing mosek dependencies (you could replace this with any other large dependencies you don’t want to include). I can also explicitly add other non-dependency files like a README or perhaps some data (e.g. mymesh.obj).
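For that last point, a sketch of appending extra files before zipping (assuming depends returns a column cell array of paths, and that README and mymesh.obj sit next to the script):

```matlab
% tack non-dependency files onto the list before zipping
C = [C;{'README';'mymesh.obj'}];
zip([name_of_script '.zip'],C);
```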

Arbitrarily preprocess a .tex file

Tuesday, August 27th, 2013

Here’s a rather magical two lines you can place at the top of your (single-file) latex file to preprocess it with an arbitrary stdin-to-stdout script (call your script foo.sh):

\input{|"cat \jobname.tex| sed -n '/^__END__/,$p' | sed '1 d' | ./foo.sh"}
__END__

So for example let’s say your tex file is:

\input{|"cat \jobname.tex| sed -n '/^__END__/,$p' | sed '1 d' |
sed 's/[Cc]ensor[A-z]*/XXXX/g'"}
__END__
\documentclass{article}
\begin{document}
\section{Censorship}
Censor words starting with censor.
\end{document}

Then the output will be the same document with every word starting with “censor” (or “Censor”) replaced by “XXXX”.

Note: If you’re using pdflatex you need to enable shell commands by adding the --shell-escape argument.
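The sed pipeline at the heart of the trick can be tested on its own, outside of latex. A sketch with a tiny stand-in for \jobname.tex (foo.sh replaced here by plain cat): it strips everything up to and including the first line beginning with __END__, leaving only the real document body.

```shell
# build a two-line stand-in for the .tex file
cat > demo.tex <<'EOF'
\input{|"cat \jobname.tex| sed -n '/^__END__/,$p' | sed '1 d' | ./foo.sh"}
__END__
\documentclass{article}
EOF
# print from the __END__ marker to the end, then drop the marker line itself
sed -n '/^__END__/,$p' demo.tex | sed '1 d' | cat
# prints: \documentclass{article}
```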

Animated GIF of vector field coordinates in matlab

Monday, August 26th, 2013

Suppose you have a mesh (V,F) and a vector field (or matrix of scalar field values) W. You can create a looping animated gif in matlab of a visualization of those values using:

t = trisurf(F,V(:,1),V(:,2),0*V(:,1), ...
  'EdgeColor','none','FaceColor','interp','FaceLighting','phong');
hold on;
% for me each coordinate corresponds to a point in C
scatter(C(:,1),C(:,2),'ok','MarkerFaceColor','y','LineWidth',3,'SizeData',100);
hold off;
axis equal;
view(2);
title('Vector Field  ','FontSize',20);
filename = 'vector-field.gif';
for w = 1:size(W,2)
  set(t,'CData',W(:,w));
  drawnow;
  im = myaa('raw');
  [imind,cm] = rgb2ind(im,256);
  if w == 1
    imwrite(imind,cm,filename,'gif','Loopcount',inf);
  else
    imwrite(imind,cm,filename,'gif','WriteMode','append');
  end
end

This produces something like:


Compile SVR on mac os x

Saturday, August 24th, 2013

I managed to get the PLC meshing program SVR (an alternative to tetgen) compiled. It was a little difficult on Mac OS X, so here are my notes.

The first weird thing is that the make configuration for SVR tries to download and compile the boost libraries. To get rid of this and just use boost as installed from macports, I changed the following line in boost/Makefile.in from:

all: stamp-boost

to

all:

Then I ran:

./configure --prefix=/usr/local/svr CPPFLAGS='-Wno-deprecated -fno-strict-aliasing'

Before making there are a few files that need to be altered. In utilities/details/hash.h add the following include:

#include <boost/cstdint.hpp>

In feature-refine/GenericMesh.h, feature-refine/RefineVertex.h, and utilities/FastHash.h remove the following

: public boost::noncopyable

In feature-refine/SVR.cpp, front-end/SVR.cpp, and point-refine/svr.cpp add the following after the definition of valgrind_hack:

((void)valgrind_hack);
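The “: public boost::noncopyable” removals above can be scripted with sed rather than done by hand. A hypothetical sketch (a demo file stands in for the real headers):

```shell
# stand-in for one of the headers listed above
cat > GenericMesh.h <<'EOF'
class GenericMesh : public boost::noncopyable
{
};
EOF
# strip the noncopyable base class in place (-i.bak works on both BSD and GNU sed)
sed -i.bak 's/ : public boost::noncopyable//' GenericMesh.h
cat GenericMesh.h
```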

Postprocess bad transparency pdfs from pdflatex using inkscape

Friday, August 23rd, 2013

I’ve recently renewed my frustration with pdflatex when trying to include pdfs I’ve created in Illustrator with transparency. I finally have a “solution”. Save the following in a file called fixpdftransp.sh:

#!/bin/bash
if [ $# -eq 0 ]
then
  echo "Usage:"
  echo "  fixpdftransp input.pdf output.pdf"
  exit 1
fi
# This is known to work with:
#   Included pdfs of compatibility >1.4
#   Inkscape version 0.48.2 r9819 (Jul 15 2011)
# This is known not to work with:
#   Included pdfs of compatibility 1.3
#   Inkscape 0.48.3.1 r9886 (Aug 23 2013)
INKSCAPE="/Applications/Inkscape.app/Contents/Resources/bin/inkscape"
FORMATIN=".fixpdftransp-%09d.pdf"
FORMATOUT=".fixpdftransp-ink-%09d.pdf"
NUM_PAGES=`pdfinfo $1 | grep Pages: | sed -e "s/ *Pages: *//g"`
LIST=""
for i in $(seq 1 ${NUM_PAGES})
do
  PAGE_NAME_IN=`printf $FORMATIN $i`
  pdftk $1 cat $i output $PAGE_NAME_IN
  echo "Creating $PAGE_NAME_IN"
  PAGE_NAME_OUT=`printf $FORMATOUT $i`
  echo "$INKSCAPE $PAGE_NAME_IN --export-pdf=$PAGE_NAME_OUT"
  $INKSCAPE $PAGE_NAME_IN --export-pdf=$PAGE_NAME_OUT
  LIST="$LIST $PAGE_NAME_OUT"
done
pdftk $LIST cat output $2

Notice you need a specific version of inkscape. I’m using whatever you download from their site today. Notably, this does not work with the inkscape from macports.

Save your included pdfs as PDF 1.4 or higher. Use pdflatex to create a bad pdf and run the script on it:

./fixpdftransp.sh bad.pdf fixed.pdf

Update: Though this works on simple pdfs, inkscape seems to screw with the fonts and other images a lot. So, unfortunately, the resulting pdf is probably not usable.

Workaround for broken pdftk burst

Friday, August 23rd, 2013

The macports version of pdftk seems to be broken when using the burst command:

pdftk input.pdf burst output page-%02d.pdf

This is supposed to split apart a multipage pdf into single pages, but instead it produces a Java runtime exception:

Unhandled Java Exception:
java.lang.NullPointerException
at com.lowagie.text.pdf.PdfCopy.copyIndirect(pdftk)
at com.lowagie.text.pdf.PdfCopy.copyObject(pdftk)
at com.lowagie.text.pdf.PdfCopy.copyDictionary(pdftk)

To get around this I wrote a short script. Save the following in a file called pdftkburst.sh:

#!/bin/bash
if [ $# -eq 0 ]
then
  echo "Usage:"
  echo "  pdftkburst input.pdf output-%04d.pdf"
  exit 1
fi
NUM_PAGES=`pdfinfo $1 | grep Pages: | sed -e "s/ *Pages: *//g"`
#echo "NUM_PAGES: $NUM_PAGES."
for i in $(seq 1 ${NUM_PAGES})
do
  #echo "printf \"$2\" $i"
  PAGE_NAME=`printf "$2" $i`
  pdftk $1 cat $i output $PAGE_NAME
  #echo "Creating $PAGE_NAME"
done

Then you can burst your pdfs using:

./pdftkburst.sh input.pdf page-%02d.pdf

Which indeed creates page-01.pdf, page-02.pdf, etc.
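The page names come from letting printf expand the user-supplied pattern, e.g.:

```shell
# the second argument to pdftkburst.sh is really a printf format string
printf "page-%02d.pdf\n" 1
printf "page-%02d.pdf\n" 12
# prints: page-01.pdf then page-12.pdf
```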

Transparent gradients in included pdfs using pdflatex lost by Preview.app

Thursday, August 22nd, 2013

I’m running into this issue again: pdflatex on mac os x, installed using macports via texlive, somehow garbles pdfs with transparent gradients so that they don’t view correctly in Preview.app. The original PDF looks correct in my Preview.app:

Try compiling a small test document:

\documentclass{article}
\pdfpageattr {/Group << /S /Transparency /I true /CS /DeviceRGB>>}
\usepackage{graphicx}
\begin{document}
\begin{figure}
\includegraphics[width=\linewidth]{white2green.jpg}
\includegraphics[width=\linewidth]{{{white2green1.7}}}
\includegraphics[width=\linewidth]{{{white2green1.3}}}
\end{figure}
\end{document}

For me the middle image doesn’t show up correctly. Of course the jpg does, but it’s pure raster. The last image is a pdf, but it’s a hacky solution: if you save your pdf with linear gradients in Illustrator as Compatibility: Acrobat 4 (PDF 1.3), then Illustrator will rasterize your transparency. It does a halfway decent job, and it seems that the pdf is forward compatible: you can still edit it in Illustrator as vector graphics.

atan2 is harmonic

Thursday, August 22nd, 2013

Don’t you forget it. For the lazy among us, here’s a maple proof:

simplify(diff(arctan(y,x),x,x)+diff(arctan(y,x),y,y),size);

produces:

0
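For a by-hand check of the same fact, write θ(x,y) = atan2(y,x) and sum the second partials (away from the origin, where θ is not defined):

```latex
\[
\frac{\partial \theta}{\partial x} = \frac{-y}{x^2+y^2},
\qquad
\frac{\partial^2 \theta}{\partial x^2} = \frac{2xy}{(x^2+y^2)^2},
\]
\[
\frac{\partial \theta}{\partial y} = \frac{x}{x^2+y^2},
\qquad
\frac{\partial^2 \theta}{\partial y^2} = \frac{-2xy}{(x^2+y^2)^2},
\]
\[
\Delta\theta
= \frac{2xy - 2xy}{(x^2+y^2)^2}
= 0
\quad \text{for } x^2+y^2 \neq 0.
\]
```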