{"id":2671,"date":"2022-07-22T14:22:37","date_gmt":"2022-07-22T14:22:37","guid":{"rendered":"https:\/\/commons.princeton.edu\/epics\/?page_id=2671"},"modified":"2022-07-22T14:22:37","modified_gmt":"2022-07-22T14:22:37","slug":"3d-scanning-photogrammetry-and-the-metaverse","status":"publish","type":"page","link":"https:\/\/commons.princeton.edu\/epics\/3d-scanning-photogrammetry-and-the-metaverse\/","title":{"rendered":"3D Scanning, Photogrammetry, and the Metaverse"},"content":{"rendered":"<p><span style=\"text-decoration: underline\"><strong>Research and Experimentation for a Museum in Virtual Reality<\/strong><\/span><\/p>\n<p>Students:<\/p>\n<p>Carl Zielinski &#8217;24<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2681\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.21.50-AM.png\" alt=\"\" width=\"165\" height=\"222\" \/><\/p>\n<p>&nbsp;<\/p>\n<p><strong>Background:<\/strong><\/p>\n<p>One of the biggest buzzwords in the past year or so has been \u201cthe Metaverse\u201d: a theoretical vision for the future of the internet involving immersive, virtual worlds focused on social interaction. While Zuckerberg brought the concept mainstream rather recently, I\u2019ve been interested in the topic for a good bit now due to its overlap with virtual and augmented reality.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2672\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.15.00-AM.png\" alt=\"\" width=\"386\" height=\"124\" \/><\/p>\n<p>I actually gave a small talk on the Metaverse before it was cool. I guess that makes me a hipster now?<\/p>\n<p>(Screenshot taken from the INTERFACE website)<\/p>\n<p>&nbsp;<\/p>\n<p>One overlapping topic of great interest to me is the idea of making photorealistic worlds for<br \/>\nvirtual reality. 
Photorealistic worlds have many use cases, such as letting users visit places that they\u2019re physically unable to travel to, and helping to preserve and spread cultural heritage. I\u2019ve done my fair share of exploring in VR proto-Metaverses, and I\u2019ve stumbled on a few very convincing photorealistic worlds made using 3D scans of real-world locations. After experiencing these worlds and the recent trend of VR art exhibits, I decided that I wanted to work toward creating a VR exhibit featuring 3D scanning. Since we had a few 3D scanners available, I\u2019ve aimed to build a museum of sorts showcasing the evolution of consumer-grade scanning technology &#8211; ending with the current LIDAR sensors in modern Apple devices.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2673\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.17.11-AM.png\" alt=\"\" width=\"292\" height=\"166\" \/> <img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2674\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.17.21-AM.png\" alt=\"\" width=\"292\" height=\"164\" \/><\/p>\n<p>Left: A photo I took in VR in Tokoyoshi\u2019s \u201cTokogrammetry Gallery\u201d VRChat world<br \/>\nRight: A photo taken in VR of CAVE OKINAWA, another world by Tokoyoshi<\/p>\n<p>&nbsp;<\/p>\n<p>The first &#8211; and most problematic &#8211; scanner I experimented with was the NextEngine scanner toward the front of the room. While the hardware was in a perfectly usable state, the software proved far more troublesome. The company that produced the scanner went out of business years ago, and accessing its tutorials was only possible through Adobe Flash Player&#8230;which was itself killed off years ago. 
After a lot of troubleshooting and digging through Reddit threads, I eventually got both the machine and the tutorials working properly and ready to scan.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2675\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.18.11-AM.png\" alt=\"\" width=\"580\" height=\"483\" \/><\/p>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2676\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.18.32-AM.png\" alt=\"\" width=\"579\" height=\"472\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>However, as this was older technology, the challenges were just beginning. I spent weeks working on alignment and experimenting with the software\u2019s different features, such as cleaning up and simplifying scans &#8211; but most of the results ended up looking like abstract art pieces no matter what I did. A few of my favorites are below:<\/p>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2677\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.19.01-AM.png\" alt=\"\" width=\"649\" height=\"367\" \/><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2678\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.19.14-AM.png\" alt=\"\" width=\"652\" height=\"387\" \/><\/p>\n<p>Top Left: A scanned marker without any alignment done. 
It did not fuse well.<\/p>\n<p>Top Right: A marker with markings that I used to align the scans<\/p>\n<p>Bottom: The resulting marker &#8211; it wasn\u2019t perfect, but it was better than the Stonehenge-looking one<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>As I wasn\u2019t getting great results out of the NextEngine scanner, I also worked with an Xbox 360 Kinect and my iPad\u2019s LIDAR sensor. I spent some time troubleshooting the Kinect, as I couldn\u2019t get it to be recognized by any device I connected it to. I finally figured out that the power supply was faulty; after ordering a new one, I got the Kinect working on my PC with a piece of scanning software called Skanect and did some testing with that.<\/p>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2679\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.20.08-AM.png\" alt=\"\" width=\"674\" height=\"484\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>Some of my better results, however, came from my iPad. I tried a few apps but ended up primarily using one simply called \u201c3D Scanner App.\u201d Below is one early scan I did of one of the rooms at WPRB-FM. 
Although it\u2019s not perfect, it definitely looks like the room!<\/p>\n<p>&nbsp;<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-2680\" src=\"http:\/\/commons.princeton.edu\/epics\/wp-content\/uploads\/sites\/113\/2022\/07\/Screen-Shot-2022-07-22-at-10.20.55-AM.png\" alt=\"\" width=\"668\" height=\"545\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>In addition to these pieces of hardware and software, I also got experience with Blender (for cleaning up 3D scans), MakerBot software and 3D printers (to see if I could print any of my scans &#8211; nothing turned out good enough to print, but I did print some parts for my VR headset), and Unity with the VRChat SDK (I will likely use VRChat to host the 3D scanning museum, so I wanted to familiarize myself with the game engine and the software development kit early on).<\/p>\n<p>This concludes this semester\u2019s research and early experimentation phase. Over the summer, I plan to keep working toward decent scans with the equipment I have, and next semester I\u2019ll start building parts of the museum (and hopefully get some good scans out of the NextEngine scanner!).<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Research and Experimentation for a Museum in Virtual Reality Students: Carl Zielinski &#8217;24 &nbsp; Background: One of the biggest buzzwords in the past year or so has been \u201cthe Metaverse\u201d:&hellip; <a class=\"read-more\" href=\"https:\/\/commons.princeton.edu\/epics\/3d-scanning-photogrammetry-and-the-metaverse\/\" title=\"3D Scanning, Photogrammetry, and the Metaverse\"><i class=\"fa 
fa-arrow-right\"><\/i><\/a><\/p>\n","protected":false},"author":4319,"featured_media":2673,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"categories":[30],"tags":[],"class_list":["post-2671","page","type-page","status-publish","has-post-thumbnail","hentry","category-fall-2021"],"_links":{"self":[{"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/pages\/2671","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/users\/4319"}],"replies":[{"embeddable":true,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/comments?post=2671"}],"version-history":[{"count":1,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/pages\/2671\/revisions"}],"predecessor-version":[{"id":2682,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/pages\/2671\/revisions\/2682"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/media\/2673"}],"wp:attachment":[{"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/media?parent=2671"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/categories?post=2671"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/commons.princeton.edu\/epics\/wp-json\/wp\/v2\/tags?post=2671"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}