SceneNN: A Scene Meshes Dataset with aNNotations

Binh-Son Hua1, Quang-Hieu Pham1, Duc Thanh Nguyen2, Minh-Khoi Tran1, Lap-Fai Yu3, and Sai-Kit Yeung1

1Singapore University of Technology and Design 2Deakin University 3University of Massachusetts Boston

We introduce an RGB-D scene dataset consisting of more than 100 indoor scenes. Our scenes were captured in a variety of places, e.g., offices, dormitories, classrooms, and pantries, at the University of Massachusetts Boston and the Singapore University of Technology and Design.
All scenes are reconstructed into triangle meshes with per-vertex and per-pixel annotations. We further enrich the dataset with fine-grained information such as axis-aligned bounding boxes, oriented bounding boxes, and object poses.
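The per-vertex annotations make object-level information such as axis-aligned bounding boxes straightforward to derive: the box of an object is simply the componentwise min and max over the vertices carrying that object's label. A minimal sketch (the in-memory array layout is an assumption for illustration, not the dataset's actual file format):

```python
import numpy as np

def axis_aligned_bbox(vertices):
    """Return (min_corner, max_corner) of the axis-aligned bounding box
    enclosing an (N, 3) array of vertex positions."""
    v = np.asarray(vertices, dtype=float)
    return v.min(axis=0), v.max(axis=0)

def per_object_bboxes(vertices, labels):
    """Group per-vertex object labels into one axis-aligned box per object.

    vertices: (N, 3) float array of mesh vertex positions.
    labels:   (N,) integer array, one object id per vertex.
    Returns a dict mapping object id -> (min_corner, max_corner).
    """
    boxes = {}
    for obj_id in np.unique(labels):
        boxes[int(obj_id)] = axis_aligned_bbox(vertices[labels == obj_id])
    return boxes
```

Oriented bounding boxes additionally require an object rotation (e.g., from the stored object pose), so they are not recoverable from vertex positions alone in this way.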

News
Dataset & Tools
Dataset (120 GB)
Download script (Python)
Annotation tool (Windows x64)

Publication
Discussion

Please email us at scenenn [at] gmail.com for any inquiries. You can also post to the discussion board below.


Call for Contributions

If you find this dataset useful and would like to contribute, please do not hesitate to let us know. Below are some potential ideas you could help with to expand the dataset:

Acknowledgements

We are grateful to the anonymous reviewers for their constructive comments. We thank Fangyu Lin for his assistance with the data capture and development of the WebGL viewer, and Guoxuan Zhang for his help with the early version of the annotation tool.

Lap-Fai Yu is supported by the University of Massachusetts Boston StartUp Grant P20150000029280 and by the Joseph P. Healey Research Grant Program provided by the Office of the Vice Provost for Research and Strategic Initiatives & Dean of Graduate Studies of the University of Massachusetts Boston. This research is supported by the National Science Foundation under award number 1565978. We also acknowledge NVIDIA Corporation for the donation of a graphics card.

Sai-Kit Yeung is supported by Singapore MOE Academic Research Fund MOE2013-T2-1-159 and SUTD-MIT International Design Center Grant IDG31300106. We acknowledge the support of the SUTD Digital Manufacturing and Design (DManD) Centre which is supported by the National Research Foundation (NRF) of Singapore. This research is also supported by the National Research Foundation, Prime Minister's Office, Singapore under its IDM Futures Funding Initiative.