January 8th, 2014 at 5:00pm PDT, Agenda and Minutes


Dick Puk, Don Brutzman, William Glascoe, Myeong Won Lee, Joe Williams, and Prof. Jung-Ju Choi attended.

1. Facial animation using current H-Anim

Jung-Ju explained a document with design ideas regarding facial animation.

Joe Williams discussed a number of Jung-Ju's questions about his proposal. Check out the MPEG-4 Facial Animation Parameter (FAP) overview at the following URL.

Caution: everyone must be careful not to infringe on MPEG-4 FAP intellectual property rights in facial animation.

2. Relationship to current and future H-Anim

Don explained how feature points are animated (via HAnimSite and HAnimDisplacer nodes) and how a comparison (CoordinateInterpolator and IndexedFaceSet/mesh) might show how much value a mesh displacer adds.

A prerequisite for using a CoordinateInterpolator is that every vertex in the mesh is animated: each key frame must supply a position for every coordinate, even for vertices that do not move.

  • In X3D, a CoordinateInterpolator can animate a mesh. This is an interesting alternative that can be compared in a test (see the sketch after this list).
  • The red dots in Jung-Ju's figure shown today correspond to feature points.
  • The number of black vertices might vary depending on the implementation.
  • The most interesting choice from an H-Anim perspective might be the choice of those feature points.
    • H-Anim has about 8 (or 9) feature points for the face. See H-Anim Specification Figure B.1 and feature points 1..8, which are listed in Table B.1.
    • MPEG-4 has about 50 feature points for the face, but there are patents restricting their specific use in an open standard.
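
For the CoordinateInterpolator alternative mentioned in the first bullet above, here is a minimal X3D sketch (the single-triangle geometry, key values, and node names are made up for illustration): a TimeSensor drives a CoordinateInterpolator whose keyValue re-specifies every vertex at each key frame, and the result is routed to the mesh's Coordinate node.

  <TimeSensor DEF='Clock' cycleInterval='2' loop='true'/>
  <Shape>
    <Appearance><Material/></Appearance>
    <!-- a single triangle standing in for a face mesh -->
    <IndexedFaceSet coordIndex='0 1 2 -1'>
      <Coordinate DEF='FaceCoords' point='0 0 0  1 0 0  0 1 0'/>
    </IndexedFaceSet>
  </Shape>
  <!-- two key frames: each supplies all three vertices, moved or not -->
  <CoordinateInterpolator DEF='FaceMorph' key='0 1'
      keyValue='0 0 0  1 0 0  0 1 0    0 0 0.1  1 0 0  0 1 0'/>
  <ROUTE fromNode='Clock' fromField='fraction_changed' toNode='FaceMorph' toField='set_fraction'/>
  <ROUTE fromNode='FaceMorph' fromField='value_changed' toNode='FaceCoords' toField='set_point'/>

Note that the keyValue array must contain key.length times (number of vertices) coordinates; that is the cost being compared against a Displacer that lists only the moving vertices.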

So our first main design decision for H-Anim might be whether to

 1. pre-specify a set of feature points for common use (and re-use), or
 2. allow an author to define them (which might not add much value, since an author can define geometry anyway).

(I'm guessing we'd want option #1, but it is possible that we might find a way to do something like option #2.)
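
As a rough sketch of what option #1 could look like in X3D (the DEF names and translation values below are illustrative, not proposed specification values), pre-specified face feature points would simply be HAnimSite nodes with agreed names that every conforming model provides:

  <HAnimHumanoid DEF='TestFigure' name='TestFigure' version='2.0'>
    <!-- a few face feature points with anthropometric landmark names as in Table B.1;
         translations here are placeholders, not specification values -->
    <HAnimSite containerField='sites' DEF='hanim_sellion' name='sellion' translation='0 1.63 0.06'/>
    <HAnimSite containerField='sites' DEF='hanim_supramenton' name='supramenton' translation='0 1.54 0.06'/>
    <HAnimSite containerField='sites' DEF='hanim_l_gonion' name='l_gonion' translation='0.05 1.55 0.01'/>
    <HAnimSite containerField='sites' DEF='hanim_r_gonion' name='r_gonion' translation='-0.05 1.55 0.01'/>
  </HAnimHumanoid>

An expression or animation tool could then address these points by name without knowing anything else about the model, which is the main argument for option #1.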

There is also a finite number of human facial expressions:

  • there are multiple models published about this (see SIGGRAPH papers; can we make a list?)
  • each facial expression has a name
  • there is a set of feature points common to these expressions

So our next decisions might be

  3. to define a set of sites, likely as a superset of the existing 8 points in H-Anim
  4. to define a set of expressions, each named, each using a finite subset of the feature points

Possibility: if we added a little more to Jung-Ju's point paper (perhaps steps 1, 3, and 4 above), then we could write "expression animators" that describe the end locations of each feature point.
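
As a sketch of what such an "expression animator" might boil down to (the name, indices, and offsets are invented for illustration), each named expression could be captured as an HAnimDisplacer that lists only the affected feature-point vertices and their displacements at full weight:

  <!-- hypothetical 'smile' expression: coordIndex refers to the feature-point
       vertices in the parent face mesh; displacements are their offsets at weight = 1 -->
  <HAnimDisplacer DEF='SmileExpression' name='smile_action'
      coordIndex='21 35 36'
      displacements='0 0.004 0.002   0.006 0.003 0   -0.006 0.003 0'/>

Playing the expression then reduces to ramping the weight field between 0 and 1, as Joe describes below.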

Joe's comments on this topic

I like how the proposed parametric animation could work with the Displacer.

Displacer nodes require detailed knowledge of the parent mesh. Specifically, the vertex order of the parent mesh must not change between authoring and rendering. In contrast to CoordinateInterpolator, not all vertices of the mesh need to be animated for a given sequence: only the vertices to be moved are included in the Displacer.

HAnimDisplacer

  coordIndex    [ numerical index of each parent geometry vertex to be animated;
                  the index is the order of appearance of that vertex in the
                  user code for the parent mesh ]
  displacements [ x,y,z displacement in skeleton space applied to each vertex at
                  maximum displacement; same order as coordIndex ]
  weight        0 to 1 animation control (typically from a ScalarInterpolator),
                linearly interpolated:
                  0 = vertex stays at its initial location in the parent mesh
                  1 = vertex reaches its maximum displaced location
                  (i.e. displaced position = initial position + weight * displacement)
The parent geometry is either the Humanoid, if the Displacer is applied to the continuous mesh in the skin field, or the geometry of the Displacer node's parent Segment.
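
Putting those pieces together, here is a minimal sketch (joint, segment, and interpolator names plus all numeric values are illustrative) of a Displacer attached to a Segment's geometry, with its weight driven by a ScalarInterpolator:

  <TimeSensor DEF='ExpressionClock' cycleInterval='1.5' loop='true'/>
  <!-- ramp the weight up to 1 and back to 0 over each cycle -->
  <ScalarInterpolator DEF='WeightRamp' key='0 0.5 1' keyValue='0 1 0'/>
  <HAnimJoint DEF='hanim_skullbase' name='skullbase' center='0 1.62 0.01'>
    <HAnimSegment DEF='hanim_skull' name='skull'>
      <!-- the order of points here defines the indices used by coordIndex below -->
      <Coordinate containerField='coord' DEF='SkullCoords'
          point='0 1.60 0.06  0.02 1.58 0.06  -0.02 1.58 0.06'/>
      <Shape>
        <Appearance><Material/></Appearance>
        <IndexedFaceSet coordIndex='0 1 2 -1'>
          <Coordinate USE='SkullCoords'/>
        </IndexedFaceSet>
      </Shape>
      <!-- move vertex 0 upward by 0.01 at weight = 1 -->
      <HAnimDisplacer containerField='displacers' DEF='BrowRaise' name='brow_raise_action'
          coordIndex='0' displacements='0 0.01 0'/>
    </HAnimSegment>
  </HAnimJoint>
  <ROUTE fromNode='ExpressionClock' fromField='fraction_changed' toNode='WeightRamp' toField='set_fraction'/>
  <ROUTE fromNode='WeightRamp' fromField='value_changed' toNode='BrowRaise' toField='set_weight'/>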

Some old work: http://h-anim.org/Specifications/H-Anim1.1/appendices.html#appendixb

3. Standardization item for facial animation

We will discuss further in Korea. Points of interest:

  • the only skeletal motion in the face is the jaw, but rotations of the tongue, eyeballs, eyelids, and so on are also addressable (see the sketch after this list)
  • the boundaries of the facial muscles could form the basis of the facial animation parameters (perimeters of primitive zones)
  • the boundaries of facial skin as mapped by estheticians could form the basis of the H-Anim facial animation primitives
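
For the one skeletal case, jaw motion would just be ordinary joint animation. A minimal sketch (assuming the temporomandibular joint name from the H-Anim skeleton hierarchy; center and angle values are placeholders) using an OrientationInterpolator:

  <TimeSensor DEF='JawClock' cycleInterval='1' loop='true'/>
  <!-- open the jaw to about 0.3 radians and close it again each cycle -->
  <OrientationInterpolator DEF='JawOpen' key='0 0.5 1'
      keyValue='1 0 0 0   1 0 0 0.3   1 0 0 0'/>
  <HAnimJoint DEF='hanim_skullbase' name='skullbase' center='0 1.62 0.01'>
    <HAnimJoint DEF='hanim_temporomandibular' name='temporomandibular' center='0 1.57 0.03'/>
  </HAnimJoint>
  <ROUTE fromNode='JawClock' fromField='fraction_changed' toNode='JawOpen' toField='set_fraction'/>
  <ROUTE fromNode='JawOpen' fromField='value_changed' toNode='hanim_temporomandibular' toField='set_rotation'/>

Eyeball, eyelid, and tongue rotations would follow the same pattern on their respective joints; everything else in the face comes back to mesh displacement.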

4. NWIP discussion

Dick reports that both New Work Item Proposals have passed ISO SC 24 review.

Congratulations everyone! Thanks Myeong Won Lee for drafting the original documents!

5. Parametric Human Project

William mentioned collaborating with the Parametric Human Project as they amass microCT data of human skeletons. See a paper describing the project scope and goals.

6. Scheduling next meeting

Next meeting will be held in person, co-located at the meetings below, as well as via Web3D teleconference: January 20th (Monday) at 4:00pm PST (21st, 9:00am KST).


This wiki approach worked well today; we will try to continue it to make simultaneous contributions and discussion easier.