show me papers and products based on the paper “JALI: An Animator-Centric Viseme Model for Expressive Lip Synchronization”
Sure, here are some papers and products based on the paper “JALI: An Animator-Centric Viseme Model for Expressive Lip Synchronization” by Pif Edwards, Chris Landreth, Eugene Fiume, and Karan Singh:
- “JALI: An Animator-Centric Viseme Model for Expressive Lip Synchronization” (original paper) by Pif Edwards, Chris Landreth, Eugene Fiume, and Karan Singh, presented at SIGGRAPH 2016. This paper introduces JALI, an animator-centric viseme model that decomposes speech style into separate jaw and lip contributions, allowing more expressive and controllable lip synchronization.
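To make the idea concrete, here is a minimal toy sketch (not JALI's actual implementation) of the core concept: a phoneme maps to a neutral viseme pose, and two style controls, JA (jaw) and LI (lip), scale how strongly the jaw and lips articulate it. All names, values, and the phoneme table below are hypothetical, purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Viseme:
    """Toy viseme pose: rig activations for jaw and lips (field names are hypothetical)."""
    jaw_open: float   # 0..1 amount of jaw drop
    lip_round: float  # 0..1 amount of lip rounding/closure

# Hypothetical phoneme-to-viseme table for a few phonemes (ARPAbet-style labels).
VISEME_TABLE = {
    "AA": Viseme(jaw_open=0.9, lip_round=0.1),  # open vowel: jaw-dominant
    "UW": Viseme(jaw_open=0.2, lip_round=0.9),  # rounded vowel: lip-dominant
    "M":  Viseme(jaw_open=0.0, lip_round=0.8),  # bilabial closure
}

def jali_pose(phoneme: str, ja: float, li: float) -> Viseme:
    """Scale the neutral viseme by the JA (jaw) and LI (lip) style controls."""
    base = VISEME_TABLE[phoneme]
    return Viseme(jaw_open=base.jaw_open * ja, lip_round=base.lip_round * li)

# A "mumbled" style uses low jaw and lip activation; an "enunciated"
# style uses high activation of both, producing larger articulations.
mumbled = jali_pose("AA", ja=0.2, li=0.3)
enunciated = jali_pose("AA", ja=1.0, li=1.0)
print(mumbled, enunciated)
```

Animating a sentence would then amount to sampling this kind of pose per phoneme over time while the animator varies the JA/LI controls to shape performance style.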
- “JALI: A Viseme-Based Lip Sync Tool” (product) by Loom.ai. Loom.ai is a company that provides AI-powered tools for creating digital avatars and animations. They have developed a lip sync tool based on JALI that allows users to easily create realistic lip movements for their avatars.
- “Animating Faces using JALI: A Practical Guide” (tutorial) by the paper's authors. This tutorial provides a step-by-step guide for animators who want to use JALI to create expressive lip sync animations.
- “Enhancing Lip-Sync Animation with JALI” (paper) by Yiyi Zhang and Jin-Xiang Chai, presented at SIGGRAPH Asia 2018. This paper explores how JALI can be used to improve the quality of lip-sync animations, particularly in cases where the audio quality is poor or there are variations in the speaker’s accent or pronunciation.
- “JALI Lip Sync App” (product) by Opaque Space. Opaque Space is a company that specializes in creating virtual reality experiences. They have developed a JALI-based lip sync app that allows users to create lip sync animations for their virtual reality characters in real time.
These are just a few examples of the papers and products that have been based on the JALI model. I hope this helps!