> A sign language interpreter using live video feed from the camera.
The project was completed in 24 hours as part of HackUNT-19, the University of North Texas's annual Hackathon. You can view the project demo on [YouTube](http://bit.ly/30xYgT8).
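This excerpt contains no code, but the tagline describes a classifier running over live camera frames. As a rough illustration of the kind of frame preprocessing such a pipeline typically needs, here is a minimal sketch that converts a color frame to a fixed-size normalized grayscale array; the `preprocess` function, the 64×64 input size, and the pure-NumPy resize are assumptions for illustration, not the team's actual implementation:

```python
import numpy as np

def preprocess(frame, size=(64, 64)):
    """Turn an HxWx3 uint8 frame into a normalized grayscale array.

    Hypothetical helper: grayscale via standard luminance weights,
    then a nearest-neighbor downsample to a fixed classifier input size.
    """
    gray = frame @ np.array([0.299, 0.587, 0.114])          # (H, W) luminance
    ys = np.linspace(0, gray.shape[0] - 1, size[0]).astype(int)
    xs = np.linspace(0, gray.shape[1] - 1, size[1]).astype(int)
    small = gray[np.ix_(ys, xs)]                            # nearest-neighbor resize
    return small / 255.0                                    # scale into [0, 1]

# Example with a synthetic frame standing in for a captured camera image:
frame = np.random.randint(0, 256, (480, 640, 3)).astype(np.uint8)
x = preprocess(frame)
print(x.shape)  # (64, 64)
```

In a live setting the synthetic frame would come from the camera feed, with each preprocessed array fed to the trained sign classifier.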
## Table of contents
* [General info](#general-info)
**The entire demo of the project can be found on [YouTube](http://bit.ly/30xYgT8).**
## Screenshots
* Add more sign languages
## Status
Project is: _finished_. Our team was the winner of the UNT Hackathon 2019. You can find our final submission post on [devpost](http://bit.ly/2QeITdz).
## Contact
Created by me and my teammates [Siddharth Oza](https://github.com/siddharthoza), [Ashish Sharma](https://github.com/ashish1993utd), and [Manish Shukla](https://github.com/Manishms18).