
LivePortrait: Portrait Animation with Motion Control

PyTorch implementation for animating portraits by transferring expressions from driving videos.

Repository stats: 17.8K stars · 1.9K forks · Overall rank #444 · AI & ML rank #115

Learn more about LivePortrait

LivePortrait is a deep learning framework for portrait animation that transfers motion from driving videos to static portrait images. The system uses neural networks with stitching and retargeting control mechanisms to maintain portrait identity while applying facial expressions and head movements. It supports both human and animal portraits, with separate models trained for different subject types. The framework includes features for regional control, pose editing, and video-to-video portrait editing workflows.
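The core idea of motion transfer can be sketched with a toy keypoint example: measure how the driving frame's facial keypoints deviate from a neutral pose, then apply that offset to the source portrait's keypoints. The function name, keypoint layout, and values below are illustrative assumptions, not LivePortrait's actual API.

```python
# Toy sketch of keypoint-based motion transfer (illustrative only,
# not the project's real interface).
import numpy as np

def transfer_motion(source_kp, driving_kp, driving_neutral_kp, scale=1.0):
    """Apply the driving frame's motion (its offset from a neutral pose)
    to the source portrait's keypoints."""
    motion = driving_kp - driving_neutral_kp  # expression/pose delta
    return source_kp + scale * motion

# Five 2D facial keypoints: eyes, nose, mouth corners (made-up values)
source = np.array([[0.3, 0.4], [0.7, 0.4], [0.5, 0.55], [0.4, 0.7], [0.6, 0.7]])
neutral = source.copy()
driving = neutral.copy()
driving[3:, 1] += 0.05  # driving video moves the mouth corners down

animated = transfer_motion(source, driving, neutral)
# Only the mouth keypoints move; eyes and nose stay fixed.
```

In the real system the "keypoints" are learned implicit features and the transfer is done by a neural warping module, but the delta-and-apply structure is the same.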

1. Stitching Control

Implements neural stitching mechanisms to maintain portrait identity and visual consistency during animation. The system preserves facial features while applying motion transformations.
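The intent of stitching can be illustrated with a simple compositing sketch: blend the animated face region back into the original portrait through a soft mask, so everything outside the mask keeps the source's identity and background pixel-for-pixel. This is a minimal stand-in, not LivePortrait's learned stitching network.

```python
# Minimal sketch of the stitching idea: soft-mask compositing that keeps
# the unanimated region identical to the source (illustrative only).
import numpy as np

def stitch(original, animated, mask):
    """mask is 1.0 inside the animated region, 0.0 outside,
    with soft values at the boundary."""
    return mask * animated + (1.0 - mask) * original

original = np.zeros((4, 4))   # stand-in for the source portrait
animated = np.ones((4, 4))    # stand-in for the warped/animated frame
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0          # only the central "face" region is animated

result = stitch(original, animated, mask)
# Border pixels come from the original; the center comes from the animation.
```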

2. Multi-Subject Support

Includes separate models for human and animal portrait animation. Both models handle facial expressions and head movements with subject-specific optimizations.
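Selecting between subject-specific models typically comes down to dispatching on the subject type at inference time. The checkpoint paths below are hypothetical placeholders; the repository ships its own weight files for human and animal modes.

```python
# Sketch of subject-type dispatch. The checkpoint paths are hypothetical,
# not the repository's actual file names.
HUMAN_WEIGHTS = "weights/human_portrait.pth"    # hypothetical path
ANIMAL_WEIGHTS = "weights/animal_portrait.pth"  # hypothetical path

def select_checkpoint(subject_type: str) -> str:
    """Map a subject type to the matching trained model weights."""
    checkpoints = {"human": HUMAN_WEIGHTS, "animal": ANIMAL_WEIGHTS}
    if subject_type not in checkpoints:
        raise ValueError(f"unsupported subject type: {subject_type}")
    return checkpoints[subject_type]

print(select_checkpoint("animal"))  # → weights/animal_portrait.pth
```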

3. Regional Control

Provides granular control over different facial regions during animation. Users can selectively apply motion to specific areas like eyes, mouth, or head pose independently.
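Regional control amounts to gating each region's motion before composing the final animation. The dict-of-deltas representation and region names below are illustrative, not the project's API.

```python
# Sketch of regional motion gating: zero out the motion delta for any
# region the user disables (illustrative representation only).
def apply_regional_motion(motion, enabled):
    """motion: {region: delta}; enabled: {region: bool}."""
    return {region: (delta if enabled.get(region, False) else 0.0)
            for region, delta in motion.items()}

motion = {"eyes": 0.2, "mouth": 0.8, "head_pose": 0.1}
enabled = {"eyes": True, "mouth": True, "head_pose": False}  # keep head still

print(apply_regional_motion(motion, enabled))
# → {'eyes': 0.2, 'mouth': 0.8, 'head_pose': 0.0}
```

This is what lets a user, for example, transfer lip motion from a driving video while pinning the head pose of the source portrait.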


