## Archive for May, 2016

### A simple, cheapskate super resolution implementation in pure matlab

Thursday, May 26th, 2016

As a proof of concept, I implemented a very simple image upsampling/super resolution algorithm. This method uses local self-examples: it replaces each pixel with a 2×2 block of pixels interpolated between blocks from the same image. The blocks are identified by matching the local regions around them.

If the window radius is 1, then I first create a point Wij in R9 for every pixel (i,j), so that W22 = [I11, I12, I13, I21, I22, I23, I31, I32, I33], where Iij is the color at pixel (i,j).

Then for every 2×2 block Dij centered at the bottom-right corner of pixel (i,j) I create a point in R9, so that D22 = [0.25(I00 + I01 + I10 + I11), 0.25(I02 + I03 + I12 + I13), …, 0.25(I44 + I45 + I54 + I55)]. So the points Dij represent downsampled blocks.

The above treats all coordinates (neighboring pixel values) equally. I found that it's best to weight each coordinate's influence by a Gaussian so that the center pixel/block counts more.

Then for each Wij I identify the K closest points Dij using knnsearch and replace the pixel (i,j) with an interpolation of the corresponding K high-resolution blocks. This is done using Shepard interpolation, with distances measured in the weighted R9 space.
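The whole pipeline is short enough to sketch. Below is a Python/NumPy translation of the idea for grayscale images (the real implementation, imupsample, is MATLAB code in gptoolbox); the function name upsample2x, the defaults K=4 and sigma=1.0, and the use of SciPy's cKDTree in place of knnsearch are all my own choices for illustration:

```python
import numpy as np
from scipy.spatial import cKDTree

def upsample2x(I, K=4, sigma=1.0):
    """2x upsampling by local self-examples (grayscale, window radius 1)."""
    H, W = I.shape
    P = np.pad(I, 3, mode='edge')  # pad so every window/block exists
    # Gaussian weights over the 3x3 grid of offsets (center worth more)
    g = np.exp(-np.array([-1.0, 0.0, 1.0])**2 / (2 * sigma**2))
    gw = np.outer(g, g).ravel()
    # W features: the 3x3 pixel window around each pixel, weighted
    Wf = np.stack([P[3+di:3+di+H, 3+dj:3+dj+W].ravel()
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)], axis=1) * gw
    # A[r, c] = average of the 2x2 block of P with top-left corner (r, c)
    A = 0.25 * (P[:-1, :-1] + P[:-1, 1:] + P[1:, :-1] + P[1:, 1:])
    # D features: a 3x3 grid of 2x2 block averages (downsampled 6x6 region)
    Df = np.stack([A[3+di:3+di+H, 3+dj:3+dj+W].ravel()
                   for di in (-2, 0, 2) for dj in (-2, 0, 2)], axis=1) * gw
    # For each window, find the K nearest downsampled blocks
    dist, idx = cKDTree(Df).query(Wf, k=K)
    # High-res 2x2 block at each candidate location: [tl, tr, bl, br]
    blocks = np.stack([P[3:3+H, 3:3+W], P[3:3+H, 4:4+W],
                       P[4:4+H, 3:3+W], P[4:4+H, 4:4+W]],
                      axis=-1).reshape(H * W, 4)
    # Shepard (inverse-distance) interpolation of the K matched blocks
    w = 1.0 / (dist + 1e-8)
    w /= w.sum(axis=1, keepdims=True)
    out = (w[:, :, None] * blocks[idx]).sum(axis=1)  # (H*W, 4)
    # Tile each pixel's 2x2 block into a double-resolution image
    return out.reshape(H, W, 2, 2).transpose(0, 2, 1, 3).reshape(2*H, 2*W)
```

Since the output is a convex combination of blocks from the input image, the result stays within the input's value range.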

Flipping through the literature, this seems similar in spirit to 2000-era Freeman-style super-resolution algorithms, and the results are comparable. But they're by no means as good as today's state-of-the-art, learning-heavy techniques.

Find the code for imupsample in gptoolbox.

Here are some examples:

And here’s a bigger example:

There's still a bathroom window effect happening, but with more local samples to choose from, upsampling bigger images seems to work better.

### Trouble building opencv_contrib extras, can’t find unsupported/Eigen

Monday, May 23rd, 2016

I ran into a problem compiling the opencv_contrib modules.

Just running a vanilla cmake then make:

cmake -DOPENCV_EXTRA_MODULES_PATH=../opencv_contrib/modules ..
make


produced the error:

/Users/ajx/Downloads/opencv/opencv_contrib/modules/rgbd/src/odometry.cpp:41:10: fatal error: 'unsupported/Eigen/MatrixFunctions' file not found
#include <unsupported/Eigen/MatrixFunctions>


It seems the problem is that cmake was finding Eigen in /Library/Frameworks/Eigen.Framework/. I don’t even know how that Eigen got installed. Homebrew? Does it ship with Mac OS X now? In any case, I fixed the issue by pointing cmake directly to my homebrew’s Eigen installation:

cmake -DOPENCV_EXTRA_MODULES_PATH=../opencv_contrib/modules -DEIGEN_INCLUDE_PATH=/usr/local/include/eigen3/ ..


### Adding zero to sparse matrix in matlab results in dense matrix

Friday, May 20th, 2016

Here’s a little gotcha I stepped into today:

issparse(speye(20000)+0)


returns false (and the addition takes a long time as a result). This is consistent in the sense that sparse + dense in matlab returns dense, but it definitely goes against my expectation in this specific case.
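For comparison, SciPy's sparse matrices treat this case differently: adding the scalar 0 leaves the matrix sparse, and adding any nonzero scalar (which would densify it) is simply refused. A quick sketch:

```python
import scipy.sparse as sp

A = sp.eye(20000, format='csr')  # sparse identity
B = A + 0                        # scalar zero: stays sparse, returns instantly
print(sp.issparse(B))            # True

try:
    A + 1                        # nonzero scalar would densify, so SciPy refuses
except NotImplementedError as e:
    print('refused:', e)
```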

### Apache rewrite rule to send all URLs from site A to the analogous URL on site B

Thursday, May 19th, 2016

Here’s the rewrite rule I’m putting on my cs.columbia.edu site’s .htaccess file to redirect everything to my new cs.toronto.edu site:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.cs.columbia.edu$ [NC]
RewriteRule ^(.*)$ http://www.cs.toronto.edu/~jacobson/$1 [R=301,L]


Now, not only does http://www.cs.columbia.edu/~jacobson redirect to http://www.cs.toronto.edu/~jacobson/, but so do more complicated URLs: http://www.cs.columbia.edu/~jacobson/images/alec-jacobson.jpg redirects to http://www.cs.toronto.edu/~jacobson/images/alec-jacobson.jpg

This also works for <img src=…> tags.

Debugging .htaccess commands is made especially tedious because browsers like Google Chrome and Safari cache old .htaccess files. So it’s easy to be tricked into thinking your rewrite rules aren’t working. Remember to clear your cache between attempts (or use incognito windows each time).

### Linker error on freshly brewed python install

Wednesday, May 18th, 2016
After a fresh brew install python, importing PIL immediately failed with a linker error:

from PIL import Image
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/PIL/Image.py", line 119, in <module>
import io
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/io.py", line 51, in <module>
import _io
Expected in: flat namespace


Apparently this is happening because bash was still confused about which python to use after brew install python. I issued:

hash -r python


to fix the problem. But just using a new shell would also work.
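For context, the culprit is bash's command hash table: bash remembers the full path of each command after its first use, so a freshly installed binary earlier in $PATH can be shadowed by a stale entry. A minimal illustration of the mechanism (the python case is the same):

```shell
ls > /dev/null   # running a command makes bash hash its location
hash             # list the remembered command locations
hash -r          # forget them all; the next lookup re-walks $PATH
```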

### Unknown locale error on freshly brewed python

Wednesday, May 18th, 2016
Importing matplotlib on a freshly brewed python failed with a locale error:

    import matplotlib.path
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 1131, in <module>
rcParams = rc_params()
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 975, in rc_params
return rc_params_from_file(fname, fail_on_error)
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 1100, in rc_params_from_file
config_from_file = _rc_params_in_file(fname, fail_on_error)
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 1018, in _rc_params_in_file
with _open_file_or_url(fname) as fd:
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/contextlib.py", line 17, in __enter__
return self.gen.next()
File "/usr/local/lib/python2.7/site-packages/matplotlib/__init__.py", line 1000, in _open_file_or_url
encoding = locale.getdefaultlocale()[1]
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/locale.py", line 543, in getdefaultlocale
return _parse_localename(localename)
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/locale.py", line 475, in _parse_localename
raise ValueError, 'unknown locale: %s' % localename
ValueError: unknown locale: UTF-8


Setting these environment variables (e.g., in ~/.bash_profile) fixed it:

export LC_ALL=en_US.UTF-8
export LANG=en_US.UTF-8


### Thingi10K: A Dataset of 10,000 3D-Printing Models

Tuesday, May 17th, 2016

Qingnan “James” Zhou and I have released a technical report detailing the contents of and methodology behind our ten-thousand-model Thingi10K dataset.

Abstract
Empirically validating new 3D-printing related algorithms and implementations requires testing data representative of inputs encountered in the wild. An ideal benchmarking dataset should not only draw from the same distribution of shapes people print in terms of class (e.g., toys, mechanisms, jewelry), representation type (e.g., triangle soup meshes) and complexity (e.g., number of facets), but should also capture problems and artifacts endemic to 3D printing models (e.g., self-intersections, non-manifoldness). We observe that the contextual and geometric characteristics of 3D printing models differ significantly from those used for computer graphics applications, not to mention standard models (e.g., Stanford bunny, Armadillo, Fertility). We present a new dataset of 10,000 models collected from an online 3D printing model-sharing database. Via analysis of both geometric (e.g., triangle aspect ratios, manifoldness) and contextual (e.g., licenses, tags, classes) characteristics, we demonstrate that this dataset represents a more concise summary of real-world models used for 3D printing compared to existing datasets. To facilitate future research endeavors, we also present an online query interface to select subsets of the dataset according to project-specific characteristics. The complete dataset and per-model statistical data are freely available to the public.

### Matlab imresize with bilinear method computes different result than bilinear interpolation

Thursday, May 12th, 2016

Strangely, it seems that matlab’s builtin function imresize does not reproduce the usual bilinear interpolation.

Let’s consider a toy 4×4 image:

F = matrixnormalize(magic(4));


Upsample it to 256×256 with imresize:

BF = imresize(F,[256 256],'bilinear');


Compare that to explicitly computing the bilinear interpolation:

[X,Y] = meshgrid(1:size(F,2),1:size(F,1));
[BX,BY] = meshgrid(linspace(1,size(F,2),size(BF,2)),linspace(1,size(F,1),size(BF,1)));
BF2 = interp2(X,Y,F,BX,BY,'bilinear');


The difference is obvious in the corners and along the edges: the imresize result has some flat (piecewise constant) patches. Why? It actually looks as if the image was padded by repeated values before upsampling.
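One plausible explanation, sketched in Python/NumPy for illustration: imresize appears to use the pixel-center sampling convention, whereas the interp2 call above aligns the first and last output samples exactly with the first and last input pixels. Under the pixel-center convention the outermost output samples fall outside the input grid and get clamped to the edge pixels:

```python
import numpy as np

n, m = 4, 256  # a 4x4 input resized to 256x256
scale = m / n

# Sample locations used by the explicit interp2/linspace call:
corner_aligned = np.linspace(1, n, m)

# Pixel-center convention (what imresize appears to use):
# output pixel x (1-based) samples the input at (x - 0.5)/scale + 0.5
pixel_center = (np.arange(1, m + 1) - 0.5) / scale + 0.5

print(corner_aligned[0], corner_aligned[-1])  # 1.0 4.0
print(pixel_center[0], pixel_center[-1])      # 0.5078125 4.4921875
```

The pixel-center grid reaches past [1, n]; clamping those out-of-range samples to the edge pixels reproduces exactly the "padded by repeated values" look.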

### Rig Animation with a Tangible and Modular Input Device preprint + video

Thursday, May 5th, 2016

We’ve put up a project page and a preprint of our new SIGGRAPH 2016 paper “Rig Animation with a Tangible and Modular Input Device”, joint work with Oliver Glauser, Wan-Chun Ma, Daniele Panozzo, Otmar Hilliges, and Olga Sorkine-Hornung.

This is not just version 2.0 of our tangible and modular input device from 2014 (although the new hardware is totally awesome). In this paper we also present a new optimization for mapping joints and splitters to any industry-grade character rig. The optimization outputs instructions for assembling a device out of parts and then maps its degrees of freedom to all parameters of the rig.

Abstract
We propose a novel approach to digital character animation, combining the benefits of tangible input devices and sophisticated rig animation algorithms. A symbiotic software and hardware approach facilitates the animation process for novice and expert users alike. We overcome limitations inherent to all previous tangible devices by allowing users to directly control complex rigs using only a small set (5-10) of physical controls. This avoids oversimplification of the pose space and excessively bulky device configurations. Our algorithm derives a small device configuration from complex character rigs, often containing hundreds of degrees of freedom, and a set of sparse sample poses. Importantly, only the most influential degrees of freedom are controlled directly, yet detailed motion is preserved based on a pose interpolation technique. We designed a modular collection of joints and splitters, which can be assembled to represent a wide variety of skeletons. Each joint piece combines a universal joint and two twisting elements, allowing to accurately sense its configuration. The mechanical design provides a smooth inverse kinematics-like user experience and is not prone to gimbal locking. We integrate our method with the professional 3D software Autodesk Maya® and discuss a variety of results created with characters available online. Comparative user experiments show significant improvements over the closest state-of-the-art in terms of accuracy and time in a keyframe posing task.