Oct 12, 2012
 

PyQt4 UI Development for Maya

Just released my 3rd Python-based online training video through cmiVFX.com!

Introduction

This tutorial is about learning the PyQt4 Python bindings for the Qt framework, and how to introduce new UI elements into Maya as a platform.
We discuss what comprises a “Framework” and a “GUI Framework”, and how Qt and PyQt4 work together.

 

Getting Started With PyQt4

There are multiple ways of getting a working installation of PyQt4, both for the general system and for Maya. We look into these approaches to get your system up and running to begin working with PyQt4!
We also talk about what is included, such as command line tools and applications, tips on how to test and learn the code, and how to structure a project.

 

PyQt4 Fundamentals

Let’s get crackin’ and learn the basics!
• What is a QObject? What is a QWidget? Common PyQt4 classes are explained in detail
• Working with the Qt Designer application, to build a UI visually
• Layouts: Making widgets resize elegantly and stay organized in your design
• Coordinate space: How do widgets transform in your 2D screen space?
• QApplication and the Qt Event Loop: The engine that runs your UI
• Events, Signals, and Slots: How components communicate changes, and how the application can respond to those changes to stay dynamic (see the sketch after this list)
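
These pieces come together in even the tiniest stand-alone program. Just to show the shape of things, here is a minimal sketch of a PyQt4 app that builds a small widget, connects a signal to a slot, and hands control to the event loop:

    import sys
    from PyQt4 import QtGui

    app = QtGui.QApplication(sys.argv)

    widget = QtGui.QWidget()
    widget.setWindowTitle("Hello PyQt4")
    layout = QtGui.QVBoxLayout(widget)

    label = QtGui.QLabel("0 clicks")
    button = QtGui.QPushButton("Click me")
    layout.addWidget(label)
    layout.addWidget(button)

    # Slots are just python callables; connect the button's clicked signal to one
    clicks = [0]
    def onClicked():
        clicks[0] += 1
        label.setText("%d clicks" % clicks[0])

    button.clicked.connect(onClicked)

    widget.show()
    sys.exit(app.exec_())  # the Qt event loop runs until the window closes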

 

General Examples

With an understanding of the framework components, we can begin working with fully functional stand-alone examples.
• Common PyQt4 app template
• Subclassing Widgets: Adding custom functionality to the existing classes provided by PyQt4
• Dialogs: Raising dialog windows above existing windows, modal vs. non-modal, and creating forms. We look at different ways to validate the data users provide to these dialog forms (a small sketch follows this list).
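
As a small taste of that last point, here is a sketch of a modal dialog that validates its form before accepting. The field and class names are just illustrative:

    import sys
    from PyQt4 import QtGui

    class NameDialog(QtGui.QDialog):
        """A tiny modal form that refuses to close until a name is entered."""
        def __init__(self, parent=None):
            super(NameDialog, self).__init__(parent)
            layout = QtGui.QFormLayout(self)
            self.nameField = QtGui.QLineEdit()
            layout.addRow("Name:", self.nameField)

            buttons = QtGui.QDialogButtonBox(
                QtGui.QDialogButtonBox.Ok | QtGui.QDialogButtonBox.Cancel)
            buttons.accepted.connect(self.accept)
            buttons.rejected.connect(self.reject)
            layout.addRow(buttons)

        def accept(self):
            # validate the user's data before letting the dialog close
            if not str(self.nameField.text()).strip():
                QtGui.QMessageBox.warning(self, "Invalid", "Please enter a name.")
                return
            super(NameDialog, self).accept()

    app = QtGui.QApplication(sys.argv)
    dialog = NameDialog()
    # exec_() blocks until the user accepts or cancels (modal behavior)
    if dialog.exec_() == QtGui.QDialog.Accepted:
        print "Name:", dialog.nameField.text()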

 

PyQt4 And Maya Introduction

Finally, some Maya action! Maya has a slightly different approach to using PyQt4…
• How do the QApplication and event loop work inside Maya?
• Common Maya PyQt4 app template
• Looking at the Maya API’s MQtUtil class
• The sip module: Helping us translate between Maya’s Qt and our own PyQt4 code (a minimal example follows this list)
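
As a preview of the pattern this chapter builds toward, here is a minimal sketch, run from Maya’s script editor and assuming PyQt4 is installed for Maya’s Python, that wraps Maya’s main window with MQtUtil and sip so a PyQt4 dialog can be parented to it:

    from PyQt4 import QtCore, QtGui
    import sip
    import maya.OpenMayaUI as mui

    def getMayaWindow():
        # Maya already owns the running QApplication; we only need to wrap
        # its main window pointer so our widgets can be parented to it
        ptr = mui.MQtUtil.mainWindow()
        return sip.wrapinstance(long(ptr), QtCore.QObject)

    dialog = QtGui.QDialog(getMayaWindow())
    dialog.setWindowTitle("Hello from PyQt4")
    dialog.show()  # no app.exec_() here; Maya's event loop is already running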

 

Replicating Maya’s UI Components

What better way to see examples of creating UI for Maya than to replicate some existing functionality? This also gives us the opportunity to expand on it with custom functionality.
In this chapter we take two different UI components in Maya, build a basic custom version of each, and show how to link them up to Maya’s own callbacks.
Some features of this chapter include:
• The QTableWidget
• Model / View separation with QTreeView
• Docking windows into the Maya interface
• Mixing together PyQt4, the Maya API, Maya commands, and callbacks
• Sorting model data (see the sketch after this list)
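
To give a feel for the model/view split mentioned above, here is a small stand-alone sketch using a QStandardItemModel behind a QTreeView, with sorting enabled on the view. The row data is made up purely for illustration:

    import sys
    from PyQt4 import QtGui

    app = QtGui.QApplication(sys.argv)

    # the model owns the data; the view only displays (and sorts) it
    model = QtGui.QStandardItemModel()
    model.setHorizontalHeaderLabels(["Node", "Type"])
    rows = [("pCube1", "mesh"), ("persp", "camera"), ("lambert1", "shader")]
    for name, nodeType in rows:
        model.appendRow([QtGui.QStandardItem(name), QtGui.QStandardItem(nodeType)])

    view = QtGui.QTreeView()
    view.setModel(model)
    view.setSortingEnabled(True)  # clicking a header column sorts the model data
    view.show()

    sys.exit(app.exec_())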

 

Customizations

A button can be a button, and a slider might look alright in its stock form, but sometimes we want to customize the look of our widgets. This chapter introduces multiple ways of achieving custom looks for our components.
• Stylin’ Stylesheets: Use CSS-like syntax for applying style sheets to widgets (see the snippet after this list)
• Painting By … Paint events: For even more control, we can tell a widget exactly how to draw itself on the screen. We will look at two different examples of how to use custom painting.
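
For the stylesheet approach, a minimal sketch looks like this (the colors and sizes are arbitrary):

    import sys
    from PyQt4 import QtGui

    app = QtGui.QApplication(sys.argv)

    button = QtGui.QPushButton("Styled Button")
    # CSS-like selectors and properties, applied to this widget and its children
    button.setStyleSheet("""
        QPushButton {
            background-color: #444;
            color: #eee;
            border: 1px solid #888;
            border-radius: 4px;
            padding: 6px 12px;
        }
        QPushButton:hover { background-color: #666; }
    """)
    button.show()

    sys.exit(app.exec_())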

Previous cmiVFX tutorials:

Jul 25, 2012
 

In a recent Python project where I was sending multiple messages of data per second over a basic socket, I had initially just grabbed the cPickle module to get the prototype proof-of-concept functioning properly. cPickle is awesome for easily serializing more complex Python objects like custom classes, even though in my case I am only sending basic types.

My messages were dicts with some nested dicts, lists, floats, and string values, roughly 500-1000 bytes each. cPickle was doing just fine, but there came a point where I wanted to investigate the areas that could be tightened up. The first thing I realized was that I had forgotten to tell cPickle to use its binary protocol (the default is ASCII). That alone saved me quite a bit of time. But then I casually searched online to see if any JSON options might be better, since my data is pretty primitive anyway.

I found UltraJSON (ujson), a JSON parsing library for Python written in pure C, and ran some tests. There are benchmarks on the project page for ujson, as well as other articles on the internet, but I just wanted to post my own results using a mixed-type data container. ujson came out extremely fast in the encoding test: faster than binary cPickle and msgpack. In the decoding test, however, msgpack appeared to be fastest, followed by binary cPickle, with ujson coming in third.

This test included the following:

Here is my Python 2.7.2 test script using timeit for each encode and decode step.
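
The script itself isn’t reproduced in this archive, but a minimal sketch along the same lines would look roughly like this, with the message below being just an illustrative stand-in for the real payload:

    import timeit
    import cPickle
    import ujson

    # illustrative stand-in for the ~500-1000 byte messages described above
    msg = {
        "name": "node1",
        "values": [1.0, 2.5, 3.75] * 20,
        "meta": {"host": "localhost", "port": 9000, "tags": ["a", "b", "c"]},
    }

    setup = "from __main__ import msg, cPickle, ujson"
    n = 100000

    def bench(label, stmt, extra=""):
        secs = timeit.timeit(stmt, setup=setup + extra, number=n)
        print "%-18s %.3f sec / %d calls" % (label, secs, n)

    # encoding
    bench("cPickle (binary)", "cPickle.dumps(msg, 2)")
    bench("ujson dumps", "ujson.dumps(msg)")

    # decoding
    extra = "; p = cPickle.dumps(msg, 2); j = ujson.dumps(msg)"
    bench("cPickle loads", "cPickle.loads(p)", extra)
    bench("ujson loads", "ujson.loads(j)", extra)
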
Jun 21, 2012
 

A recent project of mine involves research and development with an XBOX 360 Kinect sensor. Being a Python guy, I started searching for Python bindings to some OSX-supported framework. When you first start looking into this area it can be a little confusing: there are a number of layers in the software stack before you can accomplish anything meaningful. This is just a short and general blog post outlining the basics of what I have discovered thus far, to help anyone else who might also be getting started.

At the lowest level, you need a driver: something that can talk to the USB device that is the Kinect sensor. When you purchase the Kinect for Windows version of the sensor and you are going to be developing on Windows, much of this stack is provided to you by way of the Kinect SDK. But for the open-source folks with the standard XBOX 360 sensor, you need to piece together your own solution.

Two drivers that I have discovered thus far:

I had started with OpenKinect (libfreenect) because it comes with a Python wrapper included. There were a few dependencies (I will talk about specific build steps in just a moment), but once I got it installed I was able to fire up the included glview app and see both depth and RGB data streaming in from my sensor. The role of these drivers is simply to provide the basic streams: depth, RGB, audio, and a few other sensor data streams. If your goal is to start tracking players, seeing skeletons, and registering gestures, the drivers are not enough; at this stage you would have to build your own solution on top of the raw data.

You would now want to look into middleware that can take the raw data and provide an API with higher-level information. This would include finding users in the scene for you, tracking their body features, and giving you various events to watch for as the data streams in.

Since my goal was to have Python bindings, I found my options to be much more limited than if I were going to be developing in C++: wrappers have to exist for the framework you want. This is where my research really started ramping up. I spent a few days dealing with compile issues, as well as an actual bad power adapter that had to be exchanged. But all said and done, here is what I have settled on thus far…

  1. Driver: PrimeSense Sensor
  2. OpenNI Framework
  3. NITE middleware for OpenNI
  4. PyOpenNI python bindings

Install Details

Install homebrew (package manager)

http://mxcl.github.com/homebrew/

Install build tools

Install python2.7

Suggestion: virtualenv Environment

This is not a requirement, but I recommend using virtualenv to set up an environment that specifically uses Python 2.7 so that you don’t have to fight with mixed dependencies and versions.

Create a virtualenv called “kinect”

Install libusb (patched version)

There is a specially patched version of the libusb library, available in the form of a Homebrew formula.

Now copy platform/osx/homebrew/libusb-freenect.rb -> /usr/local/Library/Formula/

Install SensorKinect drivers

Then uncompress Bin/SensorKinect093-Bin-MacOSX-v*tar.bz2

Install OpenNI framework
  1. Go here: http://www.openni.org/Downloads/OpenNIModules.aspx
  2. Download Unstable Binary for MacOSX
  3. sudo ./install.sh
Install NITE middleware (for OpenNI)
  1. Go here: http://www.openni.org/Downloads/OpenNIModules.aspx
  2. Download Unstable MIDDLEWARE of NITE for OSX
  3. sudo ./install.sh
Install PyOpenNI

Be aware that on OSX, PyOpenNI requires a framework build of Python 2.7+, and that you must build it for x86_64 specifically. Also, I was having major problems with cmake properly finding the Python include locations. I had to suggest a fix, so please see here for the necessary corrections. I have referenced a patched fork of the repository below.

Copy the lib/openni.so module into your python2.7 site-packages directory.
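
Once the module is in place, a quick sanity check is to import it and start a depth stream. This is only a sketch, assuming the Context / DepthGenerator API used by the examples bundled with PyOpenNI:

    # assumes the API exposed in PyOpenNI's bundled examples
    from openni import Context, DepthGenerator

    ctx = Context()
    ctx.init()

    depth = DepthGenerator()
    depth.create(ctx)
    ctx.start_generating_all()

    print "openni imported and depth generator created"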

Examples

Once you have everything installed, you can try out the examples that are included both in the NITE source location that you downloaded and also in the PyOpenNI source location:

  1. NITE/Samples
  2. PyOpenNI/examples

I also tried out ofxKinect (github.com/ofTheo/ofxKinect) on the side, which is an addon for OpenFrameworks. This is kind of a separate path from the OpenNI stack; I would say it’s more like an advanced offering of libfreenect. Using the included example, I recorded a 3D point cloud that is built on the fly from the RGB and depth data:

 

Nov 20, 2011
 

A question came up in the Maya-Python mailing list that I thought was a really good topic, and should be reposted.

Someone asked how you can create Maya UI objects and embed them within your main PyQt application. Specifically, he wanted to create a modelPanel and embed it so that he would have a camera view within his own PyQt window.

Here is my example of how to achieve this…

You need sip and the MQtUtil functions to convert between Maya UI paths and Python QObjects. It’s the same idea as having to use those functions to get a reference to the Maya main window in order to parent your dialog.
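
The original example isn’t reproduced in this archive, but a condensed sketch of the approach, run from Maya’s script editor, looks roughly like this (the toQtObject helper name is just illustrative):

    from PyQt4 import QtCore, QtGui
    import sip
    import maya.cmds as cmds
    import maya.OpenMayaUI as mui

    def toQtObject(mayaName):
        # look up a Maya UI path and wrap the pointer as the matching PyQt object
        ptr = mui.MQtUtil.findControl(mayaName)
        if ptr is None:
            ptr = mui.MQtUtil.findLayout(mayaName)
        if ptr is None:
            ptr = mui.MQtUtil.findMenuItem(mayaName)
        return sip.wrapinstance(long(ptr), QtCore.QObject)

    # our own PyQt window, parented under Maya's main window
    mayaMainWindow = sip.wrapinstance(long(mui.MQtUtil.mainWindow()), QtCore.QObject)
    win = QtGui.QWidget(mayaMainWindow)
    win.setWindowFlags(QtCore.Qt.Window)
    vbox = QtGui.QVBoxLayout(win)

    # build a modelPanel under a throwaway Maya window/layout, then re-parent
    # its paneLayout widget into our own PyQt layout
    cmds.window()
    paneLayoutName = cmds.paneLayout()
    cmds.modelPanel()

    vbox.addWidget(toQtObject(paneLayoutName))
    win.resize(640, 480)
    win.show()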

Nov 15, 2011
 

The second video in the Python for Maya series, just released through cmiVFX!

Python For Maya – Volume 2

If you watched the first video, you now have a good grasp on Python. Sweet. Let’s plow through some more involved concepts like Python juggernauts!

With a working knowledge of the Python scripting language and the Maya Python commands API, we can continue to learn new ways to solve more challenging problems, create complete scripts, and build user interfaces around our tools. We also introduce the Maya Python API, a lower-level interface into Maya.

This video focuses more on breaking down full scripts, as opposed to typing out syntax. It’s jam-packed with information and moves fast to deliver as much brain food as possible. The first segment of the video transitions from beginner to intermediate level, with the majority of the video being intermediate, finishing out by touching on advanced concepts. The included project files are abundant, complete, and full of helpful documentation, so that you can take your time and learn about each piece of the tools.

If you check it out, leave me feedback!
http://cmivfx.com/store/328-Python+For+Maya+Vol+02

First video can be found here

Nov 09, 2011
 

This is a follow-up post to my previous one on Installing PyQt4 for Maya 2011.

Recently, while putting together my next video tutorial for Python for Maya, I came to a section where I wanted to demo PyQt4 in Maya 2012. But I was concerned that viewers would have to go through the complicated steps of building PyQt4. I noticed that other people have made precompiled PyQt installers available for Windows (here), but I could not find any for OSX or Linux. So I decided to put together a build.

I created a new project on GitHub called MyQt4:
https://github.com/justinfx/MyQt4

It’s a Makefile that completely downloads and builds PyQt4 for Maya, and generates a .pkg installer. Hopefully someone can contribute improvements, since I don’t have a ton of experience writing Makefiles, and maybe someone will even create a Linux version.

Here are links to the latest pkg builds:

Snow Leopard: 

Lion:  

Mountain Lion:

Here are builds other people have made:

Oct 08, 2011
 

Just released my first online video tutorial through cmiVFX!

Python Introduction Vol 01 – Maya

Amazing at Animation? Master of Modeling? Conquistador of Character Rigging?

But how is your Python?

This course brings the talented artist into the fold of the technical side of Maya. Learn the basics of Python and its place in your 3D workflow, with visual examples and real-world problems. Get a kick-start on adding some automation into your life, and solving common problems in a fraction of the time. By the end of this video, you should have a deeper understanding of one of the languages Maya speaks under the hood, and how to start viewing your scenes in terms of glorious Python code!

Check it out: http://cmivfx.com/store/320-Python+Introduction+Vol+01+-+Maya

If you check out this course, please leave me some feedback! I would love to hear your thoughts.
Stay tuned for more installments to come!

Aug 31, 2010
 
AtomSplitter

 

AtomSplitter has been updated to v1.6, available through cmivfx.com

AtomSplitter 1.6 – cmiVFX.com

Updates:

  • Exports Terragen .tgd scene file format

See the original post.

May 11, 2010
 
AtomSplitter

 

AtomSplitter (chanToFbx) has been updated to v1.2, available through cmivfx.com

AtomSplitter 1.2 – cmiVFX.com

Updates:

  • Camera rotation order set to ZXY, which is the Nuke camera default
  • Fixed a bug where the FocalLength value was not being keyframed properly
  • Added a scene scale field, for adjusting the translation values globally.

If you haven’t visited cmiVFX.com before, PLEASE check them out. Chris Maynard does an amazing job rounding up top talent in the industry to create these outstanding visual fx tutorials. The information is always cutting edge.

See the original post.