
Segment Anything Meets Point Tracking

Frano Rajič, Lei Ke, Yu-Wing Tai, Chi-Keung Tang, Martin Danelljan, Fisher Yu
ETH Zürich, HKUST, EPFL

Figure: SAM-PT design overview.

We propose SAM-PT, an extension of the Segment Anything Model (SAM) for zero-shot video segmentation. Our work offers a simple yet effective point-based perspective in video object segmentation research. For more details, refer to our paper.
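
At a high level, SAM-PT propagates a sparse set of user-provided points through the video with a point tracker and uses the tracked points to prompt SAM in every frame. The sketch below is a minimal illustration of that idea, not the code in this repository: `track_points` is a hypothetical stand-in for a point tracker such as PIPS or CoTracker, while the SAM calls use the public `SamPredictor` API from the `segment_anything` package.

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor


def track_points(frames, query_points):
    """Hypothetical stand-in for a point tracker (e.g. PIPS or CoTracker).

    Should return an array of shape (num_frames, num_points, 2) holding
    the (x, y) location of each query point in every frame.
    """
    raise NotImplementedError


def segment_video(frames, first_frame_points):
    """Sketch of the SAM-PT idea: track user points, then prompt SAM per frame."""
    # Propagate the first-frame annotation points through the whole video.
    tracks = track_points(frames, first_frame_points)

    sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
    predictor = SamPredictor(sam)

    masks = []
    for frame, points in zip(frames, tracks):
        predictor.set_image(frame)  # expects an RGB uint8 array (H, W, 3)
        # All tracked points serve as positive (label 1) prompts for the target.
        mask, _, _ = predictor.predict(
            point_coords=np.asarray(points, dtype=np.float32),
            point_labels=np.ones(len(points), dtype=np.int32),
            multimask_output=False,
        )
        masks.append(mask[0])  # boolean mask of shape (H, W)
    return masks
```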

Video Object Segmentation Demo

Annotators provide only a few points on the target object in the first video frame to obtain segmentation results for the entire video. Please visit our project page for more visualizations, including qualitative results on DAVIS 2017 videos and more Avatar clips.

Demo clips: street, bees, avatar, horsejump-high.

Interactive Point-Based Video Segmentation

Annotators can interactively add or remove points to refine the segmentation results.

Demo clips: camel, drift.
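
Interactive refinement maps naturally onto SAM's positive and negative point labels. The following is a minimal sketch of that mechanism, again using the public `SamPredictor` API; the function name and arguments are illustrative, not the repository's, and the predictor is assumed to already be set to the current frame via `set_image`.

```python
import numpy as np
from segment_anything import SamPredictor


def refine_mask(predictor: SamPredictor, clicks, labels):
    """Re-run SAM on the current frame with an updated click set.

    clicks: (N, 2) array of (x, y) pixel coordinates.
    labels: (N,) array; 1 marks a positive click (add region),
            0 marks a negative click (remove region).
    """
    mask, _, _ = predictor.predict(
        point_coords=np.asarray(clicks, dtype=np.float32),
        point_labels=np.asarray(labels, dtype=np.int32),
        multimask_output=False,
    )
    return mask[0]  # boolean mask of shape (H, W)
```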

Documentation

Explore our step-by-step guides to get up and running:

  1. Getting Started: Learn how to set up your environment and run the demo.
  2. Prepare Datasets: Instructions on acquiring and preparing the necessary datasets.
  3. Prepare Checkpoints: Steps to fetch model checkpoints.
  4. Running Experiments: Details on how to execute experiments.

Acknowledgments

We thank the authors of SAM, PIPS, CoTracker, HQ-SAM, MobileSAM, XMem, and Mask2Former for publicly releasing their code and pretrained models.

Citation

If you find SAM-PT useful in your research or if you refer to the results mentioned in our work, please star ⭐ this repository and consider citing 📝:

@article{sam-pt,
  title   = {Segment Anything Meets Point Tracking},
  author  = {Rajič, Frano and Ke, Lei and Tai, Yu-Wing and Tang, Chi-Keung and Danelljan, Martin and Yu, Fisher},
  journal = {arXiv:2307.01197},
  year    = {2023}
}