Posts Tagged ‘imagemagick’

High resolution images from rijksmuseum

Monday, June 3rd, 2013

Here’s a PHP script to download and stitch together high resolution images from the Rijksmuseum:


function prepareJSON($input) {
  //This will convert ASCII/ISO-8859-1 to UTF-8.
  //Be careful with the third parameter (encoding detect list), because
  //if set wrong, some input encodings will get garbled (including UTF-8!)
  $input = mb_convert_encoding($input, 'UTF-8', 'ASCII,UTF-8,ISO-8859-1');
  //Remove UTF-8 BOM if present, json_decode() does not like it.
  if(substr($input, 0, 3) == pack("CCC", 0xEF, 0xBB, 0xBF)) $input = substr($input, 3);
  return $input;
}

$url = $argv[1];
$url = preg_replace("/^https/","http",$url);

echo "Getting title...";
$contents = file_get_contents($url);
preg_match('/objectNumber : "([^"]*)"/',$contents,$matches);
$id = $matches[1];
preg_match('/objectTitle : "([^"]*)"/',$contents,$matches);
$offset = preg_replace("/^.*,([0-9]*)$/","\\1",$url);
# extract id from the url
$id = preg_replace("/^.*\//","",$url);
$id = preg_replace("/,.*$/","",$id);
# build the url of the JSON search results (the replacement string below is a
# reconstruction; the original was lost in formatting)
$title_url = preg_replace("/search\/objecten\?/","search/objecten.js?",$url);
$title_url = preg_replace("/#\//","&objectNumber=",$title_url);
$title_url = preg_replace("/,[0-9]*$/","",$title_url);
$contents = file_get_contents($title_url);
#$contents = file_get_contents("objecten.js");
$items = json_decode(prepareJSON($contents), true);
$title = $items["setItems"][0]["ObjectTitle"];
# prefix the maker's name taken from the url
$title = preg_replace("/^.*f.principalMaker.sort=([^#]*)#.*$/","\\1",$url)."-".$title;
$title = html_entity_decode($title, ENT_COMPAT, 'utf-8');
$title = iconv("utf-8","ascii//TRANSLIT",$title);
$title = preg_replace("/[^A-Za-z0-9]+/","-",$title);
$final = strtolower($title);
echo "\n";

echo "Getting images...";
# url of the tile metadata for this object: the original url was lost in
# formatting; it pointed at the museum's tile server and used $id
$tiles_url = "";
$contents = file_get_contents($tiles_url);
#$contents = file_get_contents("levels.js");

$levels = json_decode(prepareJSON($contents), true);
$levels = $levels["levels"];

$list = "";
foreach( $levels as $level)
{
  # only stitch the highest resolution zoom level, "z0"
  if($level["name"] == "z0")
  {
    $tiles = $level["tiles"];
    // Obtain a list of columns and rows
    foreach ($tiles as $key => $row) {
      $xs[$key] = $row['x'];
      $ys[$key] = $row['y'];
    }
    // Sort the tiles by y ascending, then x ascending
    // Add $tiles as the last parameter, to sort by the common key
    array_multisort($ys, SORT_ASC, $xs, SORT_ASC, $tiles);

    $tile_x = 0;
    $tile_y = 0;
    foreach( $tiles as $tile)
    {
      $x = $tile["x"];
      $y = $tile["y"];
      $tile_x = max($tile_x,intval($x)+1);
      $tile_y = max($tile_y,intval($y)+1);
      $img = "z0-$x-$y.jpg";
      $url = $tile["url"];
      echo "(".$x.",".$y.") ";
      file_put_contents($img, file_get_contents($url));
      $list .= " ".$img;
    }
  }
}
echo "\n";
echo "Composing images...";
`montage $list -tile ${tile_x}x${tile_y} -geometry +0+0 -quality 100 $final.jpg`;
echo "\n";
echo $final.".jpg\n";

echo "Clean up...";
`rm -f $list`;
echo "\n";

Then you can call the script from the command line with something like (putting the url of the object’s collection page inside the quotes):

php rijksmuseum.php ""
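The id extraction near the top of the script is plain regex surgery on the url. Here are the same two substitutions in sed, run on a made-up url of the right shape (the object number SK-X-123 is fabricated for illustration):

```shell
# strip everything up to the last slash, then everything from the comma on;
# what's left is the object id
url="http://www.rijksmuseum.nl/collectie/SK-X-123,0"
echo "$url" | sed -e 's/^.*\///' -e 's/,.*$//'   # prints SK-X-123
```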

Buried inside of that script is also a nice way to clean up strings for use as filenames:

$title = html_entity_decode($title, ENT_COMPAT, 'utf-8');
$title = iconv("utf-8","ascii//TRANSLIT",$title);
$title = preg_replace("/[^A-Za-z0-9]+/","-",$title);
$final = strtolower($title);
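Roughly the same cleanup can be sketched in shell (a hypothetical slugify helper; note that iconv’s //TRANSLIT output for non-ASCII characters depends on your platform and locale):

```shell
# transliterate to ascii, squash runs of non-alphanumerics to "-", lowercase
slugify() {
  printf '%s' "$1" \
    | iconv -f utf-8 -t ascii//TRANSLIT \
    | sed -e 's/[^A-Za-z0-9][^A-Za-z0-9]*/-/g' \
    | tr '[:upper:]' '[:lower:]'
}

slugify "Hello, World!"   # prints hello-world-
```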

Composite thumbnails of multipage pdf into stacked image

Monday, April 22nd, 2013

Here’s a bash script that takes a multipage pdf and produces a stack of thumbnails with nice shadows:

[multipage pdf thumbnail stack]

Save this in a file (e.g. multipagethumb) and make it executable:

#!/bin/bash

# Something similar in one go with montage and +polaroid:
# montage -gravity center null: null: 'supplemental_opt.pdf' null: null:
# -thumbnail 128x128 -sharpen 10 -bordercolor white -border 0 -background none
# +polaroid -set label '' -background Transparent -tile x1 -geometry -0+64
# -reverse -flop png:- | convert png:- -flop -trim output.png
if [ $# -lt 2 ]
then
  echo "Usage:"
  echo "  ./multipagethumb input.pdf output.png"
  exit 1
fi

## identify occasionally gives a concatenation of the number of pages, number
## of pages times: 10101010101010101010
#n=`identify -format %n $1`
n=`pdftk $1 dump_data | grep NumberOfPages | sed 's/[^0-9]*//'`

# tunable parameters (the original values were lost in formatting; these are
# stand-ins)
w=88              # thumbnail width; 88+12+30*16 = 580
x=30              # horizontal splice per page
y=30              # vertical splice per page
output=${2%.png}  # prefix for per-page temporary files

for p in $(seq 1 $n)
do
  p=`echo "$p-1"|bc`
  echo "convert $1[$p] -flatten -thumbnail ${w}x -bordercolor none -border 0 \( +clone \
    -background none -shadow 80x3+2+2 \) +swap -background none -layers \
    merge +repage  $output-$p.png"
  convert $1[$p] -flatten -thumbnail ${w}x -bordercolor none -border 0 \( +clone \
    -background none -shadow 80x3+2+2 \) +swap -background none -layers \
    merge +repage  $output-$p.png
  if [[ $p == "0" ]]
  then
    echo "convert $output-$p.png $2"
    convert $output-$p.png $2
  else
    echo "convert $output.png -gravity SouthEast -background none -splice ${x}x${y} $output.png"
    convert $output.png -gravity SouthEast -background none -splice ${x}x${y} $output.png
    echo "composite -compose dst-over $output-$p.png $output.png -gravity SouthEast $output.png"
    composite -compose dst-over $output-$p.png $output.png -gravity SouthEast $output.png
  fi
  rm $output-$p.png
done

Then issue:

./multipagethumb input.pdf output.png

Note: You can achieve something similar with montage and its +polaroid option, but it was difficult to get diagonal stacking and the correct page order.
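A small aside on the loop in the script above: it shells out to bc just to subtract one from the page counter; shell arithmetic gives the same result without the extra process:

```shell
p=5
echo "$p-1" | bc    # prints 4: bc evaluates the expression string
echo $(( p - 1 ))   # prints 4: built-in arithmetic, no subprocess
```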

Round images to width and heights divisible by 2, by cropping

Saturday, November 10th, 2012

To make an h264 movie from a bunch of images using ffmpeg, I need all the images to have dimensions divisible by two. Here’s a little one-liner to crop a bunch of images to the nearest size divisible by 2:

for file in *.png; do convert -crop `identify -format "(%[fx:w]/2)*2" $file | bc`x`identify -format "(%[fx:h]/2)*2" $file | bc`+0+0 $file cropped_$file; done

I suppose using mogrify would potentially be faster, but I’m not sure how to introduce the rounding.

imagemagick animated gif layers showing through transparency

Tuesday, May 1st, 2012

Today I finally got around to supporting screen dumps from my opengl apps with transparency. I’ve been wanting to do this for a while, so that I can easily make animated gifs that can be overlaid on a background image/video. I ran into a weird problem using imagemagick’s convert tool. I was dumping every frame to a file called: screencapture-01.tga, screencapture-02.tga, … and then calling:

convert screencapture-*.tga screencapture.gif
This made an animated gif with transparency, but each frame showed the previous frames behind it, resulting in something like this:

[worm animation with wrong transparency]

The magic keyword seems to be “dispose”; -dispose 2 selects the “Background” disposal method, which clears each frame to the background before drawing the next. Calling the following fixed my problem:

convert -dispose 2 screencapture-*.tga screencapture.gif

which results in:

[worm animation with correct transparency]

Then I can underlay a background image and get something like:

[worm animation with correct transparency over clouds]

Update: To automatically trim the image in the same command use:

convert -dispose 2 screencapture-*.tga -coalesce -repage 0x0 -trim +repage screencapture.gif
convert -dispose 2 screencapture-00*.tga -coalesce -trim -layers TrimBounds screencapture.gif

Resizing animated gifs scaling issue

Monday, February 27th, 2012

When trying to resize animated gifs into thumbnails using imagemagick’s convert, I noticed that with certain gifs the first frame of the animation would resize correctly but the subsequent frames would remain unscaled or only partially scaled. The command I was using was:

convert input.gif -thumbnail x200 -resize '200x<' -resize 50% -gravity center -crop 100x100+0+0 +repage output.gif

For non-animated gifs, this correctly makes a 100 by 100 thumbnail. For animated gifs where each frame is sufficiently different, this creates an animated thumbnail. But for animations whose frames differ only slightly, it seems to resize each "difference frame", which may not have the same bounding box. To correct this, just add "-coalesce" to the beginning of the imagemagick command sequence, like this:

convert input.gif -coalesce -thumbnail x200 -resize '200x<' -resize 50% -gravity center -crop 100x100+0+0 +repage output.gif

TexMapPreview: simple texture mapping utility

Wednesday, September 21st, 2011

[texmappreview simple texture mapping utility working on woody]

I posted the source and binary of TexMapPreview. It’s a little texture mapping utility I’ve been using to visualize texture maps on the meshes I deform. It takes as input a mesh (with texture coordinates) and a (texture) image. Then it can either write the visualization of the texture mapped mesh to an output file or display it in a GLUT window. GLUT is the only dependency. TexMapPreview itself can only read and write .tga image files, but I include a bash script wrapper which uses ImageMagick’s convert tool to enable reading and writing of all sorts of file formats (.png, .jpg, .tiff, whatever convert can read/write).

We thank Scott Schaefer for providing the wooden gingerbread man image from “Image Deformation Using Moving Least Squares”.

Convert video (.mp4 or other) to high quality animated gif

Monday, September 12th, 2011

You can do this directly with ffmpeg, but I had trouble with it and seem to remember the quality not being so good. Instead I convert my .mp4 video into an animated gif by first grabbing 10 frames every second and saving them to files using the following:

ffmpeg -i input.mp4 -r 10 output%05d.png

Then using imagemagick’s convert tool insert each frame into an animated gif:

convert output*.png output.gif

Since the frames were grabbed at 10 per second, you may also want to add -delay 10 (convert counts delay in hundredths of a second) so the gif plays back at the original speed:

convert -delay 10 output*.png output.gif

This makes a rather large .gif file for even modest videos, so you’ll probably want to post-process it (e.g. with Photoshop) to reduce the file size by compressing the .gif or reducing its dimensions.

Clean up the png files using:

rm output*.png

Extract all images from a pdf as png files (at full resolution)

Friday, August 26th, 2011

Here’s a two-liner to extract all the embedded color images in a pdf and convert them to png files. pdfimages (part of the xpdf/poppler utilities) extracts the images as ppm files, but I couldn’t open these immediately on my mac with my favorite image editing tools, so I convert them with mogrify from the imagemagick suite to png files.

pdfimages original.pdf ./extracted-images
mogrify -format png ./extracted-images*.ppm

and to get rid of the ppm files

rm ./extracted-images*.ppm

Get largest image from webpage using php, wget and imagemagick

Thursday, March 10th, 2011

Here’s a script I wrote to make autoblog a little more interesting. Now any time a spammer makes a comment it checks the URL they provide for a large image and appends it to their comment when their comment becomes a post.

I use a little php script to get the largest image from the url and return an image tag (as long as the image is big enough). Here it is:


function isValidURL($url)
{
  return preg_match('|^http(s)?://[a-z0-9-]+(\.[a-z0-9-]+)*(:[0-9]+)?(/.*)?$|i', $url);
}

function image_from_URL($URL)
{
  $tempdir = "temp_images";
  // only bother if the url looks legitimate (this guard is a reconstruction;
  // the original condition was lost in formatting)
  if(isValidURL($URL))
  {
    `wget -r -l1 -H -t1 -nd -N -np -A jpg,gif,png -P $tempdir -erobots=off $URL`;
    $handle = opendir($tempdir);
    $max_size = 0;
    $biggest = "";
    while (false !== ($file = readdir($handle)))
    {
      $extension = strtolower(substr(strrchr($file, '.'), 1));
      if($extension == 'jpg' || $extension == 'gif' || $extension == 'png')
      {
        // identify from imagemagick can return the w and h as a string "w*h"
        // which then bc can compute as a multiplication giving the area in pixels
        $size = (int) exec(
          "identify -format \"%[fx:w]*%[fx:h]\" \"$tempdir/$file\" | bc", $ret);
        if($size > $max_size)
        {
          $max_size = $size;
          $biggest = $file;
        }
      }
    }
    closedir($handle);
    if($max_size >= 80000)
    {
      return "<img src='$tempdir/$biggest' class='center'>";
    }
  }
  return "";
}

isvalidurl source
wget one-liner source

Resize image pixel by pixel without antialiasing (raw scale)

Monday, October 12th, 2009

Using ImageMagick’s convert utility, you can resize an image without any blurring or fuzziness by issuing the command with these options (this particular command resizes height to 1920 pixels):

convert input.png -filter Point -resize x1920 +antialias output.png

I resized this:

[before resize]

into this:

[after resize]

Update: As Pedrow points out, this is even easier:

convert input.png -scale x1920 output.png