post-production techniques, work with max/MSP#2

i will explain here the process of building the patch for my final crit out of several small test patches. as i mentioned before, i want max/MSP to perform post-production techniques live. i have done some research and i am happy to report that the program can do chromakeying, colour adjustment and mixing of different video layers, as well as work with 3-d objects. my project goal is to perform post-production techniques in real time and add some interactivity between the physical movement of a person and the movement of 3-d elements in the patch. here is how i want to do it (a rough sketch of the whole pipeline follows the list):
1) there will be a person standing in front of a green screen
2) a video camera will be set up to capture the person against the green screen (the goal is not to record footage; the camera will supply a real-time video feed into my computer)
3) the max/MSP patch will detect the live feed and analyze the data: it will track the movement of the person’s hands and legs and translate it into values that change in real time.
4) these values will be applied to the 3-d elements i want to place around the person.
5) in the output projection we will see the person surrounded by 3-d elements that move interactively in response to the person’s movements. the main goal is to let the person, as the main character of the projection, decide how the environment responds.
6) i will do some real-time colour and brightness/contrast adjustment.
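to make the plan concrete, here is a rough conceptual sketch of that whole pipeline in plain code. this is not the max/MSP patch itself: python and opencv stand in for the jitter objects purely as an analogy, and the backdrop file name is made up.

```python
# a rough conceptual sketch of the pipeline above -- NOT the max/MSP patch
# itself; python + opencv stand in for the jitter objects purely as an
# analogy, and "scene.jpg" is a hypothetical replacement backdrop.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                 # step 2: live camera feed
background = cv2.imread("scene.jpg")      # hypothetical backdrop image

while True:
    ok, frame = cap.read()                # grab one frame of the live feed
    if not ok:
        break

    # step 3 (very crudely): find the non-green pixels, i.e. the person
    f = frame.astype(np.int16)
    green = (f[..., 1] - np.maximum(f[..., 0], f[..., 2])) > 40
    person = ~green

    # steps 4/5: derive a control value from the person's position; in the
    # real patch this would drive the 3-d elements (not rendered here)
    ys, xs = np.nonzero(person)
    cx = xs.mean() / frame.shape[1] if len(xs) else 0.5

    # replace the green area with the backdrop, then do step 6:
    # a simple real-time brightness/contrast adjustment
    bg = cv2.resize(background, (frame.shape[1], frame.shape[0]))
    out = np.where(person[..., None], frame, bg)
    out = cv2.convertScaleAbs(out, alpha=1.1, beta=10)

    cv2.imshow("composite", out)
    if cv2.waitKey(1) == 27:              # esc quits
        break

cap.release()
cv2.destroyAllWindows()
```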
—–
for my project, first of all, i have to find out how to do chromakeying and whether it is possible to chromakey live footage. here is a patch i was working on, testing chromakeying first on a video file and then on a live feed.


in this screenshot you can see a max/MSP patch which processes two video sources: one is just a movie file (green-screen footage called green.mov) and the other is the live feed from the computer’s built-in camera. the patch chromakeys the first video by “taking out” the green colour. chromakeying is the process of superimposing one image on top of another by selectively replacing a colour, and in Jitter it is done with the jit.chromakey object. given a colour and a few other parameters, jit.chromakey detects cells containing that colour in the first (left-hand) matrix and replaces them with the equivalent cells of the second (right-hand) matrix when it constructs the output matrix. the result is that the selected colour cells of the green.mov file are superimposed onto the live feed, as you can see in the bottom window of my patch. the colour to key is picked with the suckah tool from the object palette.

when i pick the suckah tool, it appears like this in patch editing mode.

the suckah tool lets me get the rgb colour beneath it, or feed in any screen coordinates to get the rgb values of that pixel. what is interesting is that i can take a pixel reading anywhere within the patch (even from pixels that are not part of the actual video footage) and use it as the colour value to be keyed out of the video. the keying itself is done by the jit.chromakey object.
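in code terms, what suckah does is tiny: read the rgb value of one pixel at given coordinates. a hypothetical python version, with the frame as a numpy array:

```python
import numpy as np

def pick_color(frame: np.ndarray, x: int, y: int) -> tuple:
    """return the (r, g, b) value of the pixel at coords (x, y) --
    roughly what the suckah tool reports for the pixel beneath it."""
    b, g, r = frame[y, x]          # opencv-style frames store channels as b, g, r
    return int(r), int(g), int(b)

# e.g. key_color = pick_color(frame, 320, 240) becomes the reference
# colour handed to the chromakey step below.
```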

the jit.chromakey object measures the chromatic distance of each of the left input’s cells (pixels) from a reference colour (a.k.a. “green screening”). the total chromatic distance is calculated by summing the absolute value of each colour channel’s distance from the reference colour’s corresponding channel. if the distance is less than or equal to a tolerated distance (tol) value, the right input cell is multiplied by a maximum keying (maxkey) value.
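to make that rule concrete, here is a simplified python/numpy sketch of it. note this is only an approximation of jit.chromakey: in jitter the colour and tol are normalised 0..1 values, while this sketch works in raw 0-255 channel units and ignores the fade region the real object also supports.

```python
import numpy as np

def chromakey(left, right, key_color, tol=60, maxkey=1.0):
    """simplified jit.chromakey: per pixel, chromatic distance is the sum
    of absolute per-channel differences from the reference colour; pixels
    within tol are replaced by the right matrix scaled by maxkey."""
    ref = np.array(key_color, dtype=np.int16)
    dist = np.abs(left.astype(np.int16) - ref).sum(axis=-1)   # |dr|+|dg|+|db|
    keyed = dist <= tol                                       # cells to key out
    out = left.copy()
    out[keyed] = (right[keyed] * maxkey).astype(left.dtype)
    return out

# e.g. composite = chromakey(green_frame, live_frame, key_color=(0, 255, 0))
```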
keying out the live feed was relatively easy: i just replaced the video source with the live camera feed and applied the same keying objects to it. here is a patch showing this process:

in the live feed i picked a random pixel to see the keying effect. obviously i didn’t have the consistent background that i will have in my final piece, but all i wanted to see was whether keying is possible on live footage and whether it slows down the process. the keying worked without problems, so my first challenge, chromakeying a live feed, has been sorted.

the next step was to import my 3-d objects into the max/MSP patch. max supports 3-d files with the .obj extension. i found out that maya can export a scene as .obj, but certain plug-ins must be activated beforehand. i did that and successfully exported my 3-d cubes. all max does is take the polygons and place them in 3-d space; i can apply my own texturing and control their movement in 3-d space, which is exactly what i need for my project.
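an .obj file really is just a plain-text list of polygons, which is why max can load it so directly. a minimal (and deliberately incomplete) python reader showing what is actually in the file, namely the vertices and faces:

```python
def load_obj(path):
    """minimal wavefront .obj reader: collects vertex positions and faces.
    (real files also carry normals, uvs and materials -- skipped here.)"""
    vertices, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":                  # vertex line: v x y z
                vertices.append(tuple(float(p) for p in parts[1:4]))
            elif parts[0] == "f":                # face line: f v1 v2 v3 ...
                # entries may look like "3/1/2"; the first index is the vertex
                faces.append(tuple(int(p.split("/")[0]) for p in parts[1:]))
    return vertices, faces

# e.g. verts, faces = load_obj("cubes.obj")   # hypothetical maya export
```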

the cubes are made by myself, while the mushrooms are example .obj files that come with max/MSP. here i was testing placing two 3-d scenes together, and it works very well. at the moment both scenes rotate with the mouse at the same time, but i can reset them to their original positions separately; i will need to look into how to manipulate different 3-d objects independently. the rotation is done by the jit.gl.handle object: it responds to mouse clicks and drags in the destination by generating rotate and position messages out of its left outlet. these messages are sent to the 3-d objects, which can then be rotated and moved in space with the mouse (a rough sketch of this drag-to-rotate idea is below).
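the real jit.gl.handle outputs axis-and-angle rotate messages, which is more involved than i need to show here; this hypothetical python sketch just captures the general idea of turning mouse-drag deltas into rotation values:

```python
def drag_to_rotation(angles, dx, dy, sensitivity=0.5):
    """accumulate a mouse-drag delta (in pixels) into yaw/pitch angles
    (in degrees), the way a handle-style controller maps drags to rotation."""
    yaw, pitch = angles
    yaw = (yaw + dx * sensitivity) % 360            # horizontal drag spins
    pitch = max(-90.0, min(90.0, pitch + dy * sensitivity))  # vertical tilts
    return yaw, pitch

# e.g. dragging 40px right and 10px up:
angles = (0.0, 0.0)
angles = drag_to_rotation(angles, dx=40, dy=-10)
print(angles)   # -> (20.0, -5.0), sent on to the 3-d object each frame
```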
the next step is to solve my live green-screen feed. my first idea was to create a plain 3-d plane and import it as a second 3-d object in the patch. then, knowing that i can apply a texture to my 3-d objects, i will try to apply a live texture instead of a static image. i will need to research textures and how to replace an image texture with a live video texture. here is the patch which successfully does this process:

i managed to apply the live camera feed onto my 3-d plane, with my 3-d cubes around this plane in the same environment. applying a texture, whether it is a static image, a video or a live feed, is not hard at all. i am using a prepend texture object to turn any data arriving at its left inlet into a texture message. in the patch above there is a cord going out of the prepend texture object into the jit.gl.model ml object, which is the rendering engine for the 3-d objects in my patch; in this case the texture is applied directly onto the plane. a patch can get messy with connecting cords, and it is useful to know that you can send messages across the whole patch without them. to send the texture message across to the render object, i create a texture message box and link it to the render engine, as shown here: the message takes its data from objects referring to the keyword “mytexture”. i can name it anything i want; it is similar to coding in flash, where you assign variables once and then use their values at any stage of development, or in max/MSP’s case at any place in the patch.
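that variable-like behaviour can be mimicked with a simple registry: store a value once under a name, and anything elsewhere in the program can look it up by that name, no patch cord needed. a hypothetical sketch of the idea:

```python
# tiny analogy for max's named messages: a shared registry stands in for
# sending "texture mytexture" across the patch without a patch cord.
registry = {}

def send(name, value):
    """like a named message box: publish a value under a name."""
    registry[name] = value

def receive(name):
    """like an object referring to that name: fetch the current value."""
    return registry.get(name)

send("mytexture", "frame_0042")    # the live feed updates the texture
print(receive("mytexture"))        # the render engine picks it up by name
```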
the next step is chromakeying the live footage here as well. i have tried many approaches and they produce different results; i want to explain the chromakeying problems in the next post.


2 Responses to “post-production techniques, work with max/MSP#2”

  1. Mitchell Says:

    Hi,

    I’m loving your project. I’m working on a chromakey project in Max right now, and I was wondering if you could help me a little bit. I can’t seem to get the jit.pwindow to go fullscreen. Is there a trick to this? Thanks.

  2. kavi999 Says:

    hey, Mitchell!
    sure, just explain to me what exactly you want to achieve and send me a screenshot of your patch or the patch code itself, and i will see if i can help you at all! my e-mail is: kavisound@inbox.lv
    laters
    kavs
