> A sign language interpreter using live video feed from the camera.
The project was completed in 24 hours as part of HackUNT-19, the University of North Texas's annual Hackathon. You can view the project demo on [YouTube](http://bit.ly/30xYgT8).
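As an illustration of how a live-feed sign language interpreter typically turns classifier predictions into readable text, here is a minimal hypothetical sketch. The label set and function names below are assumptions for illustration only; the project's actual model and code are not shown in this excerpt.

```python
# Hypothetical sketch: mapping classifier output indices to ASL letters.
# Assumes one class per letter A-Z, as in common ASL-alphabet datasets;
# the real project's labels may differ.
import string

LABELS = list(string.ascii_uppercase)  # ["A", "B", ..., "Z"]

def indices_to_text(predictions):
    """Convert a sequence of predicted class indices into a string."""
    return "".join(LABELS[i] for i in predictions)

print(indices_to_text([7, 4, 11, 11, 14]))  # HELLO
```

In a live pipeline, each camera frame would be classified and its predicted index appended to the sequence fed to a function like this.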
## Table of contents
* [General info](#general-info)
We wanted to make it easy for 70 million deaf people across the world to be independent.
**The entire demo of the project can be found on [YouTube](http://bit.ly/30xYgT8).**
## Screenshots
Features that can be added:
* Add more sign languages
## Status
Project is: _finished_. Our team won the UNT Hackathon 2019. You can find our final submission post on [devpost](http://bit.ly/2QeITdz).
## Contact
Created by me with my teammates [Siddharth Oza](https://github.com/siddharthoza), [Ashish Sharma](https://github.com/ashish1993utd), and [Manish Shukla](https://github.com/Manishms18).