## Posts Tagged ‘rendering’

### Paper-worthy rendering in MATLAB

Thursday, July 20th, 2017

MATLAB is not a great tool for creating 3D renderings. However, the learning curves for most commercial rendering tools are quite steep. Other tools like Mitsuba can create beautiful pictures, but can feel quite cumbersome for rendering pure geometry rather than the physical scenes they’re designed for.

Over the years, I’ve developed a way of creating plots of 3D shapes in MATLAB using a few extra functions in gptoolbox. This started as a way to just make images from research prototypes more palatable, but eventually became my usual way of rendering images for papers. If the code for my research is already written in MATLAB, then one huge advantage is that every image in my paper can have a *.m script that deterministically generates the result and the corresponding image without user intervention. This helps with reproducibility, editing, and sharing between collaborators.

Here’s a “VFX Breakdown” of rendering a 3D shape in MATLAB.

t = tsurf(F,V);
set(gcf,'Color',0.94*[1 1 1]);
teal = [144 216 196]/255;
pink = [254 194 194]/255;
bg_color = pink;
fg_color = teal;
for pass = 1:10
  switch pass
  case 1
    % blank run
    axis([-209.4 119.38 -181.24 262.67 -247.28 247.38]);
  case 2
    axis equal;
    axis([-209.4 119.38 -181.24 262.67 -247.28 247.38]);
    axis vis3d;
  case 3
    t.EdgeColor = 'none';
  case 4
    set(t,fphong,'FaceVertexCData',repmat(fg_color,size(V,1),1));
  case 5
    set(t,fsoft);
  case 6
    l = light('Position',[0.2 -0.2 1]);
  case 7
    set(gca,'Visible','off');
  case 8
    set(gcf,'Color',bg_color);
  case 9
  case 10
  end

  vidObj = VideoWriter(sprintf('nefertiti-%02d.mp4',pass),'MPEG-4');
  vidObj.Quality = 100;
  vidObj.open;
  thetas = linspace(30,-30,450);
  for theta = thetas(1:end-1)
    view(theta,30);
    drawnow;
    vidObj.writeVideo(getframe(gcf));
  end
  vidObj.close;
end


Wednesday, September 9th, 2015

Here’s a little script demonstrating some of the fancy 3D-model rendering features you can squeeze out of MATLAB with a little effort. By stroking sharp edges and silhouettes and adding just a hint of transparency, you can achieve a CAD-style effect.

[V,F] = load_mesh('~/Dropbox/models/fandisk.off');
V = V*axisangle2matrix([1 0 0],pi);
N = normals(V,F);
BC = barycenter(V,F);

% sharp edges
% (A: matrix of dihedral angles between adjacent faces, C: corresponding
% corner indices; this gptoolbox call appears to have been dropped from
% the listing)
[A,C] = adjacency_dihedral_angle_matrix(V,F);
A(1&A) = abs(A(1&A)-pi)>pi*0.11;
[CI,~,CV] = find(C.*A);
E = F([CI+mod(CV,3)*size(F,1) CI+mod(CV+1,3)*size(F,1)]);

% cut mesh at sharp edges to get crisp normals
[G,I] = cut_edges(F,E);
W = V(I,:);

% floor board
BB = bounding_box(V(:,1:2));
BB = bsxfun(@plus,bsxfun(@minus,BB,mean(BB))*2,mean(BB));
BB(:,3) = min(V(:,3))-4e-3;
BB = reshape(permute(BB,[1 3 2]),[2 2 3]);
% checkboard texture
ch = repmat(1-0.2*xor((mod(repmat(0:128-1,128,1),8*2)>7), ...
(mod(repmat((0:128-1)',1,128),8*2)>7)),[1 1 3])*0.5 + 0.5;

clf;
hold on;
blue = [0.2 0.3 0.8];
tf = tsurf(G,W, ...
'FaceVertexCData',repmat(blue,size(W,1),1), ...
'SpecularStrength',0, ...
'DiffuseStrength',0.1, ...
'AmbientStrength',1.0, ...
'EdgeColor','none','FaceAlpha',0.9,fphong);
te = tsurf(E(:,[1 2 2]),V,'EdgeColor',blue*0.75);
to = tsurf([1 1 1],V,'LineWidth',2,'EdgeColor',blue*0.5);
sc = surf(BB(:,:,1),BB(:,:,2),BB(:,:,3), ...
'CData',ch,'FaceColor','texturemap', ...
'SpecularStrength',0, 'DiffuseStrength',0, 'AmbientStrength',1);
view(130,38);
axis equal;
l = light('Position',[1 4 5.0],'Style','infinite');
% faint ambient occlusion
AO = ambient_occlusion(W,G,W,per_vertex_normals(W,G),1000);
AO = AO*0.17;
tf.FaceVertexCData = bsxfun(@times,tf.FaceVertexCData,1-AO);
hold off;
axis vis3d;
camproj('persp');
set(gca,'Visible','off');
T = get(gca,'tightinset');
set(gca,'position',[T(1) T(2) 1-T(1)-T(3) 1-T(2)-T(4)]);

% Set up rotation callbacks to hide view-dependent effects during drag
% (h and g -- a patch handle and a ground-plane equation -- come from
% setup code not shown here)
up = @() ...
  set(to,'Faces', ...
    outline(F((sum(N.*bsxfun(@minus,BC,campos),2)<=0),:))*[1 0 0;0 1 1]) | ...
  set(h,'FaceAlpha',0.5*(g*[campos 1]'<0)) | ...
  set(sc,'FaceAlpha',1.0*(g*[campos 1]'<0));
up();
down = @() set(to,'Faces',[]);
set(rotate3d,'ActionPostCallback',@(src,obj) up());
set(rotate3d,'ActionPreCallback',@(src,obj) down());

for t = linspace(0,-360,60)
  view(64+t,20);
  up();
  drawnow;
end
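The dihedral-angle test above marks an edge as sharp when its angle deviates from π (perfectly flat) by more than π·0.11, about 20 degrees. Here’s a small NumPy sketch of that same test on a single shared edge (a hypothetical stand-in for illustration, not the gptoolbox implementation):

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def dihedral_angle(a, b, c, d):
    # Interior dihedral angle across edge (a,b) shared by triangles
    # (a,b,c) and (b,a,d); pi means the two faces are coplanar.
    n1 = unit(np.cross(b - a, c - a))
    n2 = unit(np.cross(a - b, d - b))
    cos_n = np.clip(np.dot(n1, n2), -1.0, 1.0)
    return np.pi - np.arccos(cos_n)

def is_sharp(theta, tol=np.pi * 0.11):  # same threshold as above
    return abs(theta - np.pi) > tol

a = np.array([0.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0])
c = np.array([0.5, 1.0, 0.0])        # third vertex of the first triangle
flat_d = np.array([0.5, -1.0, 0.0])  # coplanar neighbor: angle is pi
sharp_d = np.array([0.5, -0.1, 1.0]) # folded neighbor: far from pi
```

A coplanar neighbor survives the test (angle exactly π) while a strongly folded one is flagged sharp, which is exactly which edges end up stroked.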


Update: It even looks reasonable for less artificial shapes, though perhaps the hard edges just accidentally look fuzzy like fur:
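The faint checkerboard floor in the script above is pure index arithmetic: an xor of two square waves over the row and column indices, scaled into two light grays. A NumPy sketch of the same construction (2D here; the MATLAB version just replicates it to three channels):

```python
import numpy as np

n, sq = 128, 8                    # texture resolution and square size
i = np.arange(n)
# xor of row-wise and column-wise square waves gives the checker pattern
board = np.logical_xor((i[None, :] % (2 * sq)) > sq - 1,
                       (i[:, None] % (2 * sq)) > sq - 1)
ch = (1.0 - 0.2 * board) * 0.5 + 0.5  # two light grays: 0.9 and 1.0
```

Keeping the two values close together (0.9 vs. 1.0) is what makes the floor read as a subtle ground reference rather than a loud checkerboard.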

### Two-sided material in MATLAB

Monday, March 23rd, 2015

Unfortunately, there seems to be no built-in support for two-sided surfaces in MATLAB. There’s some rudimentary control over back-face lighting, but that’s all. At least you can determine the back-facing triangles for a given camera position:

N = normals(V,F);
BC = barycenter(V,F);
back_facing = sum(N.*bsxfun(@minus,BC,campos),2)>=0;
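In other words, with outward-pointing normals, a face is back-facing exactly when its normal has a non-negative dot product with the camera-to-face vector. A small NumPy check of that classification (toy camera and centroids, just to exercise the sign convention):

```python
import numpy as np

campos = np.array([0.0, 0.0, 5.0])  # camera position
BC = np.array([[0.0, 0.0, 0.0],     # face centroids
               [0.0, 0.0, 0.0]])
N = np.array([[0.0, 0.0, 1.0],      # normal toward the camera: front-facing
              [0.0, 0.0, -1.0]])    # normal away from the camera: back-facing
back_facing = np.sum(N * (BC - campos), axis=1) >= 0
```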


Here’s an example for an armadillo mesh:

t = tsurf(F,V,'EdgeColor','none','FaceLighting','phong');
view(2);
axis equal;
camproj('persp');
t.FaceVertexCData = 1*(sum(N.*bsxfun(@minus,BC,campos),2)<=0);
apply_ambient_occlusion();


Of course, if you change the view, the coloring is no longer valid:

So you need to recompute the coloring:

You can also insert NaNs to achieve back-face culling:

t.FaceVertexCData(sum(N.*bsxfun(@minus,BC,campos),2)>0) = nan;


### Shallow depth of field rendering in MATLAB

Saturday, December 27th, 2014

I tried a little experiment with creating lens blur effects using the simple 3D plotter.

Here’s a little chunk of code to orbit the camera around a target point in the scene and accumulate the result in a buffer.

t = tsurf(F,V,'EdgeColor','none','SpecularStrength',0,'CData',C,fphong);
camproj('persp');
im = [];
count = 0;
sam = 50;
w = 0.2;
target = camtarget;
pos = campos;
[THETA,PHI] = meshgrid(w*linspace(-2*pi,2*pi,sam),w*linspace(-2*pi,2*pi,sam));
prev_theta = 0;
prev_phi = 0;
for s = randperm(numel(THETA))
  theta = THETA(s);
  phi = PHI(s);
  camorbit(-prev_theta,-prev_phi);
  prev_theta = theta;
  prev_phi = phi;
  camorbit(theta,phi);
  frame = getframe(gcf);
  frame = im2double(frame.cdata);
  count = count+1;
  if isempty(im)
    im = zeros(size(frame));
  end
  im = im+frame;
end
im = imtrim(im/count,'Threshold',0);
close all;
imshow(im);
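The loop amounts to a Monte Carlo average: each jittered camera position contributes one sharp frame, and a scene point off the focal plane gets smeared over a region whose size grows with the lens parameter w, while in-focus points stay put. A toy 1D version of the accumulation (names and sizes are illustrative, not the MATLAB code):

```python
import numpy as np

rng = np.random.default_rng(0)
n, sam, w = 101, 50, 10        # image width, sample count, "lens size" in pixels
acc = np.zeros(n)
for _ in range(sam):
    shift = int(rng.integers(-w, w + 1))
    frame = np.zeros(n)
    frame[n // 2 + shift] = 1.0  # a point rendered at a jittered position
    acc += frame
im = acc / sam                   # averaged buffer: energy-preserving blur
```

Averaging preserves total intensity while spreading the point over at most 2·w + 1 pixels, which is why a smaller w needs fewer samples for a smooth result.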


You can get away with fewer samples if the “size of the lens” is smaller. This one uses w=0.1; sam=10:

### Ambient occlusion + anti-aliasing in MATLAB

Wednesday, October 15th, 2014

I’m enjoying the new anti-aliased graphics in MATLAB 2014b. Here’s the xyzrgb dragon rendered with soft lighting, ambient occlusion, and a simple colormap:

Here’s the code to reproduce it:

[V,F] = load_mesh('/Users/ajx/Documents/AutoDOF/Code/skinning/skinning/xyzrgb_dragon/xyzrgb_dragon.obj');
AO = ambient_occlusion(V,F,V,per_vertex_normals(V,F),1000);
colormap(parula(9));
t = tsurf(F,V,'EdgeColor','none');
C = squeeze(ind2rgb(floor(matrixnormalize(t.FaceVertexCData)*size(colormap,1))+1,colormap));
t.FaceVertexCData = bsxfun(@times,C,1-AO);
t.SpecularStrength = 0.1;
t.DiffuseStrength = 0.1;
t.AmbientStrength = 0.8;
l = light('Position',[1 1 100],'Style','infinite');
l2 = light('Position',[1 -100 1],'Style','infinite');
set(gca,'XTickLabel',[],'YTickLabel',[],'ZTickLabel',[],'Color',[0.94 0.94 0.94]);
set(gcf,'Color','w');
axis equal
camproj('persp');
t.Vertices = V*axisangle2matrix([0 0 1],pi)*axisangle2matrix([1 0 0],pi/2);
view(-43,6);
axis tight;
drawnow;
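Two bits of arithmetic do most of the work here: normalized scalar values are quantized into the 9 colormap bins, and each vertex color is scaled by 1 − AO so occluded regions darken without leaving [0, 1]. A NumPy sketch of both steps (an illustration of the math, not the gptoolbox code):

```python
import numpy as np

# Quantize normalized values into n_bins colormap bins, mirroring
# floor(matrixnormalize(...)*n)+1 above (1-based, like MATLAB indexing).
vals = np.array([-2.0, 0.0, 1.3, 5.0])   # e.g. per-vertex scalar data
norm = (vals - vals.min()) / (vals.max() - vals.min())
n_bins = 9
idx = np.minimum(np.floor(norm * n_bins).astype(int) + 1, n_bins)

# Darken by ambient occlusion: scale the color by (1 - AO).
color = np.array([0.2, 0.3, 0.8])
ao = 0.4                                 # ambient occlusion in [0, 1]
shaded = color * (1.0 - ao)
```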


The ambient_occlusion call takes quite some time on my little laptop. But I think the result looks nice.

### MATLAB 2014b features anti-aliasing

Monday, October 13th, 2014

Finally. I’m pretty happy about the results:

[V,F] = load_mesh('/usr/local/igl/libigl/examples/shared/cheburashka.off');
AO = ambient_occlusion(V,F,V,per_vertex_normals(V,F),1000);
t = tsurf(F,V,fphong,'EdgeColor','none');
C = squeeze(ind2rgb(floor(matrixnormalize(t.FaceVertexCData)*size(colormap,1))+1,colormap));
t.FaceVertexCData = bsxfun(@times,C,1-AO);
t.SpecularStrength = 0.1;
t.DiffuseStrength = 0.1;
t.AmbientStrength = 0.8;
l = light('Position',[1 1 100],'Style','infinite');
l2 = light('Position',[1 -100 1],'Style','infinite');
set(gca,'XTickLabel',[],'YTickLabel',[],'ZTickLabel',[],'Color',[0.94 0.94 0.94]);
set(gcf,'Color','w');


And to spin the camera around:

axis equal
axis vis3d;
camproj('persp');
% T (an animation parameter in [0,1]) and filename are assumed to be
% defined earlier, e.g. T = linspace(0,1,60); filename = 'spin.gif';
for f = 1:numel(T)
  t = T(f);
  view(-cos(t*2*pi)*45,sin(t*2*pi)*45+45);
  drawnow;
  frame = getframe(gcf);
  [SIf,cm] = rgb2ind(frame.cdata,256);
  if f == 1
    imwrite(SIf,cm,filename,'Loop',Inf,'Delay',0);
  else
    imwrite(SIf,cm,filename,'WriteMode','append','Delay',0);
  end
end


With the awesome but now obsolete myaa.m hacked anti-aliasing, creating this gif would have taken many minutes. This runs in real time.

### Compile and run mesa on bluehost web server

Sunday, October 7th, 2012

I want to use the off-screen renderer of Mesa in a PHP script on my Bluehost-served website. Compiling Mesa on my Mac was dead simple (sudo port install mesa), but doing it on the Linux server without root access or repositories was a bit tricky. Here’s how I finally got it to work.

Download and compile LLVM, if it’s not around already. I found that version 3.1 didn’t play nicely with Mesa but 3.0 did. LLVM installed smoothly.


./configure --prefix=[INSTALL_PREFIX]
make -j5
make install


Next, grab the latest glproto headers. As far as I can tell, there is nothing to compile as only headers are needed.


export GLPROTO_LIBS=../glproto-1.4.16/;
export GLPROTO_CFLAGS=../glproto-1.4.16/;
# configure, disabling DRI support (i.e. graphics card support)
./configure --prefix=[INSTALL_PREFIX] --disable-driglx-direct --enable-xlib-glx --enable-osmesa --disable-dri
make -j5
make install


Then I got the Mesa demos and made sure I could compile src/osdemos/osdemo.c:


gcc -o osdemo osdemo.c -I[INSTALL_PREFIX]/include -L[INSTALL_PREFIX]/lib -lOSMesa -lGLU


Upon running osdemo, you might see:


./osdemo: error while loading shared libraries: libOSMesa.so.8: cannot open shared object file: No such file or directory


But this is fixed by adding your library install path to the LD_LIBRARY_PATH variable:


export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:[INSTALL_PATH]/lib/

or in PHP:

putenv("LD_LIBRARY_PATH=".$_ENV["LD_LIBRARY_PATH"].":[INSTALL_PATH]/lib/");


If it works then you can run the program with:


./osdemo foo.tga


and produce an image like:

### Cheap tricks for OpenGL transparency

Wednesday, September 5th, 2012

I’ve been experimenting with some hacks using basic OpenGL commands to achieve good-quality transparency (alpha blending).

I’ll compare 4 methods.

1. Traditional: For this one (and all the others) we turn on alpha blending:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);


and then just use a normal depth test without back face culling in a single pass:


glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = alpha


Notice two things: the handle is not visible in the second picture, and the first picture has zebra-stripe artifacts from alpha blending without sorting. Sorting is slow, view-dependent, and still won’t guarantee correct results, so instead let’s see how far we can get with cheap tricks.
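The stripes are a symptom of the blend mode itself: glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) implements the “over” operator, which is not commutative, so unsorted triangles composite to different colors depending on rasterization order. A quick numerical check (the colors here are made up):

```python
import numpy as np

def over(src_rgb, src_a, dst_rgb):
    # glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
    return src_a * src_rgb + (1.0 - src_a) * dst_rgb

bg = np.array([1.0, 1.0, 1.0])
gold = np.array([0.8, 0.6, 0.1])
blue = np.array([0.1, 0.2, 0.8])
a = 0.5

# Same two fragments, composited in the two possible orders:
gold_then_blue = over(blue, a, over(gold, a, bg))
blue_then_gold = over(gold, a, over(blue, a, bg))
```

The two orders give visibly different results, which is exactly what shows up as stripes wherever triangle order flips across the screen.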

2. GL_GREATER: This is a 3-pass attempt.

double f = 0.75; // Or some other factor
glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = f*alpha

glDisable(GL_CULL_FACE);
glDepthFunc(GL_GREATER);
// render teapot with alpha = alpha

glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = (alpha-f*alpha)/(1.0-f*alpha);


This often produces reasonable results, but the idea is a bit strange. It’s sort of like depth peeling from opposite ends, hoping that if there are only two layers we’ll catch both of them.
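The odd-looking alpha in the third pass is not arbitrary: if the first pass deposits opacity f·α and the final pass deposits (α − f·α)/(1 − f·α), the combined coverage of the two works out to exactly α, so splitting the rendering never changes the overall opacity. A quick check of that identity:

```python
f = 0.75
alphas = [0.1, 0.5, 0.9, 1.0]
totals = []
for alpha in alphas:
    a1 = f * alpha                                 # first-pass opacity
    a2 = (alpha - f * alpha) / (1.0 - f * alpha)   # corrective final-pass opacity
    # combined coverage of two composited layers
    totals.append(1.0 - (1.0 - a1) * (1.0 - a2))
```

Algebraically, (1 − a1)(1 − a2) = (1 − f·α)·(1 − α)/(1 − f·α) = 1 − α, so the total is always α regardless of f.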

Notice that now the handle shows up in the second image, but we still have ordering artifacts. Admittedly a lot of these would go away with back-face culling, but then we wouldn’t get two-sided rendering (blue inside, gold outside).

3. ALWAYS: This method will seem like a step back because it introduces more ordering artifacts due to the two-sided rendering, but the concept is a bit simpler.

double f = 0.75; // Or some other factor
glDisable(GL_CULL_FACE);
glDepthFunc(GL_ALWAYS);
// render teapot with alpha = f*alpha

glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = (alpha-f*alpha)/(1.0-f*alpha);

The idea is again the same: in the first pass we pick up a little bit of everything, albeit not sorted in the correct order, and then we finish with one pass that ensures the top layer is correct.

We’re getting artifacts from the ordering in the first image, but I’ll show in a second that these just come from not handling the two-sided rendering correctly. What we see in both cases is that the bottom of the teapot shows through clearly.

4. QUINTUPLE PASS: This is the final solution that I arrived at. It involves five passes, which, using display lists, should be OK for many real-time applications.

glDisable(GL_CULL_FACE);
glDepthFunc(GL_LESS);
// render teapot with alpha = 0, to prime the depth buffer

glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glDepthFunc(GL_ALWAYS);
// render teapot with alpha = f*alpha

glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = (alpha-f*alpha)/(1.0-f*alpha)

glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glDepthFunc(GL_ALWAYS);
// render teapot with alpha = f*alpha

// There's a trade off here. With culling enabled then a perfectly
// opaque object (alpha=1) may be wrong. With it disabled, ordering
// artifacts may appear
// glEnable(GL_CULL_FACE);
// glCullFace(GL_BACK);
glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
// render teapot with alpha = (alpha-f*alpha)/(1.0-f*alpha)


The idea is that we first prime the depth buffer with the final depth values, then use the previous method just to show the inside, “back-facing” surface, and then do the same to show the front.

Now we get two very nice renderings that even composite nicely against the background. We can continuously blend our alpha value toward 1 and expect a perfectly correct opaque rendering. There are still some small ordering artifacts in the top image near the handle. If we know we want a transparent object, we can flip back-face culling on for the last pass to get rid of these, but as noted this gives an incorrect result when alpha ≈ 1.