<h1 align="center">
<span style="color:#2196f3;"><b>Refaçade</b></span>: Editing Object with Given Reference Texture
</h1>

<p align="center">
Youze Huang<sup>1,*</sup>
Penghui Ruan<sup>2,*</sup>
Bojia Zi<sup>3,*</sup>
Xianbiao Qi<sup>4,†</sup>
Jianan Wang<sup>5</sup>
Rong Xiao<sup>4</sup> <br>
<sup>*</sup> Equal contribution. <sup>†</sup> Corresponding author.
</p>
<p align="center">
<span><sup>1</sup> University of Electronic Science and Technology of China</span>&emsp;
<span><sup>2</sup> The Hong Kong Polytechnic University</span><br>
<span><sup>3</sup> The Chinese University of Hong Kong</span>&emsp;
<span><sup>4</sup> IntelliFusion Inc.</span>&emsp;
<span><sup>5</sup> Astribot Inc.</span>
</p>

<p align="center">
<a href="https://huggingface.co/fishze/Refacade"><img alt="Huggingface Model" src="https://img.shields.io/badge/%F0%9F%A4%97%20Huggingface-Model-brightgreen"></a>
<a href="https://github.com/fishZe233/Refacade"><img alt="Github" src="https://img.shields.io/badge/Refaçade-github-black"></a>
<a href="https://arxiv.org/abs/2512.04534"><img alt="arXiv" src="https://img.shields.io/badge/Refaçade-arXiv-b31b1b"></a>
<a href="https://huggingface.co/spaces/Ryan-PR/Refacade"><img alt="Huggingface Space" src="https://img.shields.io/badge/%F0%9F%A4%97%20Huggingface-Space-1e90ff"></a>
<a href="https://refacade.github.io/"><img alt="Demo Page" src="https://img.shields.io/badge/Website-Demo%20Page-yellow"></a>
</p>

https://github.com/user-attachments/assets/e1e53908-2c78-4433-947d-11d124a4dd32

## 🚀 Overview

**Refaçade** is a unified image–video retexturing model built on the Wan2.1-based VACE framework. Given a user-provided reference texture, it edits the surface material of a specified object in a video while preserving the object's geometry and the background. A **Jigsaw Permutation** decouples structural information from the reference image, and a **Texture Remover** disentangles the original object's appearance, so users can explore diverse texture edits without disturbing the rest of the scene.

---

## 🛠️ Installation

Our project is built upon [Wan2.1-based VACE](https://github.com/ali-vilab/VACE).

```bash
pip install -r requirements.txt
pip install wan@git+https://github.com/Wan-Video/Wan2.1
```
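
After installation, it can be worth verifying that PyTorch was pulled in correctly and can see your GPU before downloading checkpoints. This is only a generic sanity check, not a step required by the project:

```bash
# Optional sanity check: confirm PyTorch is importable and a CUDA device is visible
# (a CUDA-capable GPU is assumed for running Wan2.1-sized video models)
python -c "import torch; print('CUDA available:', torch.cuda.is_available())"
```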

---

## 🏃‍♂️ Gradio Demo

You can use the Gradio demo to retexture objects. Note that you do not need to compile SAM2.

```bash
python app.py
```
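
Gradio serves the demo locally by default (typically at http://127.0.0.1:7860). If you need a different host or port, Gradio's standard environment variables work, assuming `app.py` does not hard-code its own server settings:

```bash
# Bind to all interfaces on a custom port using Gradio's standard environment variables
# (only takes effect if app.py does not explicitly override the server name/port)
GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7870 python app.py
```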

---

## 📂 Download

First, download our checkpoints:

```shell
huggingface-cli download --resume-download fishze/Refacade --local-dir models
```

Next, download the SAM2 checkpoint [sam2_hiera_large.pt](https://huggingface.co/facebook/sam2-hiera-large) and place it at:

```shell
sam2/SAM2-Video-Predictor/checkpoints/
```
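
If you prefer fetching the checkpoint from the command line, a download along the following lines should place the file in the expected folder; the repository and filename are taken from the link above, so adjust them if the repository layout differs:

```bash
# Download the SAM2 checkpoint directly into the expected directory
# (repository and filename per the link above; verify before relying on this)
huggingface-cli download facebook/sam2-hiera-large sam2_hiera_large.pt \
  --local-dir sam2/SAM2-Video-Predictor/checkpoints
```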

We recommend organizing the local directories as follows:

```
Refacade
├── ...
├── examples
├── models
│   ├── refacade
│   │   └── ...
│   ├── texture_remover
│   │   └── ...
│   └── vae
│       └── ...
├── sam2
└── ...
```

---

## ⚡ Quick Start

### Minimal Example

```bash
python test_pipe.py \
  --ref_img ./assets/single_example/1.png \
  --ref_mask ./assets/single_example/mask.png \
  --video_path ./assets/single_example/1.mp4 \
  --mask_path ./assets/single_example/mask.mp4 \
  --output_dir ./outputs
```
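
The four input flags mirror each other: a reference image plus its mask, and a target video plus its mask video. To process several cases in one go, a simple shell loop works; the `./assets/examples` layout below is only an assumption for illustration, with each folder mirroring the naming used in `assets/single_example`:

```bash
# Hypothetical batch run: each folder under ./assets/examples is assumed to contain
# 1.png, mask.png, 1.mp4 and mask.mp4, matching the single-example naming above
for d in ./assets/examples/*/; do
  name=$(basename "$d")
  python test_pipe.py \
    --ref_img "${d}1.png" \
    --ref_mask "${d}mask.png" \
    --video_path "${d}1.mp4" \
    --mask_path "${d}mask.mp4" \
    --output_dir "./outputs/${name}"
done
```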

---