{"id":10628,"date":"2022-09-08T10:53:23","date_gmt":"2022-09-08T14:53:23","guid":{"rendered":"https:\/\/desis.osu.edu\/seniorthesis\/?p=10628"},"modified":"2022-09-08T10:53:25","modified_gmt":"2022-09-08T14:53:25","slug":"could-sketch2pose-show-the-way-to-easy-3d-modeling","status":"publish","type":"post","link":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/2022\/09\/08\/could-sketch2pose-show-the-way-to-easy-3d-modeling\/","title":{"rendered":"Could Sketch2Pose Show The Way To Easy 3D Modeling?"},"content":{"rendered":"\n<p>Sketch2Pose provides an intriguing capability that might eventually lead to a new way of 3D modeling.<\/p>\n\n\n\n<p>The tool is a kind of 2D to 3D converter. The idea is to have an artist sketch out a figure in a pose, and then Sketch2Pose uses its internal knowledge of anatomy to extrapolate the sketch into a posed 3D model.<\/p>\n\n\n\n<p>If this works consistently, it could dramatically reduce the effort required by 3D artists to convert storyboard sketches into 3D characters usable in games, films, or other media.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"alignright size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2.jpg\" alt=\"\" class=\"wp-image-10637\" width=\"401\" height=\"285\" srcset=\"https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2.jpg 983w, https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2-300x214.jpg 300w, https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2-768x547.jpg 768w, https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2-100x70.jpg 100w, https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2-696x496.jpg 696w, https:\/\/desis.osu.edu\/seniorthesis\/wp-content\/uploads\/2022\/09\/sketch2-590x420.jpg 590w\" sizes=\"auto, (max-width: 401px) 100vw, 401px\" \/><figcaption>Generating a 
3D model from a 2D drawing [Source: Sketch2Pose]<\/figcaption><\/figure>\n<\/div>\n\n\n<p>In the world of 3D printing, Sketch2Pose isn\u2019t directly useful, as it seems intended mostly for game builders. However, the process by which this is accomplished is rather interesting.<\/p>\n\n\n\n<p>The researchers have defined a solution range that encompasses the possible poses of the human body. The input sketch is then mapped to a specific solution that, with luck, matches the drawing\u2019s intention. This works only because there is a fixed range of solutions.<\/p>\n\n\n\n<p>Now let\u2019s consider a long-standing problem in 3D printing: acquiring a 3D model for printing. Today there are only a few ways of obtaining such a 3D model:<\/p>\n\n\n\n<ul><li>Buying or downloading one that\u2019s been designed by someone else<\/li><li>3D scanning an item and hoping the scan is printable (often it\u2019s not)<\/li><li>Learning a CAD system and tediously building the 3D model from scratch or from a template<\/li><\/ul>\n\n\n\n<p>The latter two options are the best, yet they are basically inaccessible to the vast majority of the population. This alone has presented a massive adoption barrier to 3D printing that has yet to be overcome.<\/p>\n\n\n\n<p>What most people need and could actually use is the ability to simply sketch out what they want and have a 3D model produced.<\/p>\n\n\n\n<p>No such system exists.<\/p>\n\n\n\n<p>But Sketch2Pose seems to do something very much like that. Could it, or an adaptation of it, be used to generate 3D printable models?<\/p>\n\n\n\n<p>I think this is actually possible, if the solution range is limited, just as is done with Sketch2Pose. 
Imagine something called \u201cSketch2Bolt\u201d, which would accept a sketch of a bolt and then attempt to design a 3D model of a corresponding bolt.<\/p>\n\n\n\n<p>The sketcher could draw the shape of the bolt head, the proportions of its length, perhaps even the dimensions, and Sketch2Bolt could generate the bolt 3D model in the same way Sketch2Pose generates a human pose 3D model.<\/p>\n\n\n\n<p>Sketch2Bolt would make bolt 3D models, but not nuts. For that you\u2019d need \u201cSketch2Nut\u201d.<\/p>\n\n\n\n<p>Then you can imagine a series of different \u201cSketch2\u201d systems, each focusing on a particular type of mechanical part. If these were gathered together on a website, the user could simply pick the kind of object required and then provide a sketch. Custom 3D models could be quickly produced on demand for these parts, and then printed.<\/p>\n\n\n\n<p>If such a system existed, it could revolutionize consumer use of 3D printers. Today there are plenty of consumers using 3D printers, but most of them are simply producing more plastic dragons. What if they were suddenly energized to \u201ccreate\u201d their own parts whenever they needed them, with a simple sketch?<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Reflexive Analysis<\/h2>\n\n\n\n<p>A research project from l&#8217;Universit\u00e9 de Montr\u00e9al, this emerging technology could revolutionize many aspects of art synthesis and personal study. Sketch2Pose opens the door to much larger, similar technologies. Gradually, less and less human intervention and skill are needed to achieve the same results that previously required a skilled operator (3D modeling, CAD, etc.), pushing the art\/design world to incorporate more multimedia solutions. This advancement is similar to the rise of AI-generated art we have encountered recently in social spaces. Could the future of art be highly automated and computationally generated?<\/p>\n\n\n\n<p>Can a computer create art? 
Is a human using AI\/technology to create something an artist, a computer technician, both, or something else entirely? Where is the line drawn separating &#8220;real&#8221; art from generated art? Does such a line exist? Is generated art somehow less valuable than real art? How can generated art mesh with human-created art? Does the creator of an art-making AI receive credit for other people using their tool? Does a painter credit the company that manufactures their paint?<\/p>\n\n\n\n<figure class=\"wp-block-embed alignleft is-type-wp-embed is-provider-fabbaloo wp-block-embed-fabbaloo\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"wp-embedded-content\" data-secret=\"MvRfysU2Lv\"><a href=\"https:\/\/www.fabbaloo.com\/news\/could-sketch2pose-show-the-way-to-easy-3d-modeling\">Could Sketch2Pose Show The Way To Easy 3D Modeling?<\/a><\/blockquote><iframe loading=\"lazy\" class=\"wp-embedded-content\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; clip: rect(1px, 1px, 1px, 1px);\" title=\"&#8220;Could Sketch2Pose Show The Way To Easy 3D Modeling?&#8221; &#8212; Fabbaloo\" src=\"https:\/\/www.fabbaloo.com\/news\/could-sketch2pose-show-the-way-to-easy-3d-modeling\/embed#?secret=OMi3ZjmV9Q#?secret=MvRfysU2Lv\" data-secret=\"MvRfysU2Lv\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\"><\/iframe>\n<\/div><\/figure>\n\n\n\n<pre class=\"wp-block-preformatted\"><em>Stevenson, Kerry. \"Could Sketch2Pose Show The Way To Easy 3D Modeling?\" Fabbaloo, Terran Data Corporation, 29 Aug. 2022, www.fabbaloo.com\/news\/could-sketch2pose-show-the-way-to-easy-3d-modeling. Accessed 8 Sept. 2022. <\/em><\/pre>\n","protected":false},"excerpt":{"rendered":"<p>Sketch2Pose provides an intriguing capability that might eventually lead to a new way of 3D modeling. The tool is a kind of 2D to 3D converter. 
The idea is to have an artist sketch out a figure in a pose, and then Sketch2Pose uses its internal knowledge of anatomy to extrapolate the sketch into a [&hellip;]<\/p>\n","protected":false},"author":77,"featured_media":10635,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[30,2],"tags":[287,341,338,98,342,343,340,339,336,335,337],"class_list":{"0":"post-10628","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-arts-featured","8":"category-featured","9":"tag-3d","10":"tag-3d-model","11":"tag-3d-modeling","12":"tag-art","13":"tag-cad","14":"tag-drawing","15":"tag-model","16":"tag-modeling","17":"tag-pose","18":"tag-sketch","19":"tag-sketch2pose"},"_links":{"self":[{"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/posts\/10628","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/users\/77"}],"replies":[{"embeddable":true,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/comments?post=10628"}],"version-history":[{"count":7,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/posts\/10628\/revisions"}],"predecessor-version":[{"id":10658,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/posts\/10628\/revisions\/10658"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/media\/10635"}],"wp:attachment":[{"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/media?parent=10628"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/desis.osu.edu\/seniorthes
is\/index.php\/wp-json\/wp\/v2\/categories?post=10628"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/desis.osu.edu\/seniorthesis\/index.php\/wp-json\/wp\/v2\/tags?post=10628"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}