Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp012z10wq379
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Finkelstein, Adam | en_US
dc.contributor.author | Lu, Jingwan | en_US
dc.contributor.other | Computer Science Department | en_US
dc.date.accessioned | 2014-06-05T19:45:56Z | -
dc.date.available | 2014-06-05T19:45:56Z | -
dc.date.issued | 2014 | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp012z10wq379 | -
dc.description.abstract | Digital artists create evocative drawings and paintings using a tablet and stylus coupled with digital painting software. Research systems have shown promising improvements in various aspects of the art creation process by targeting specific drawing styles and natural media, for example oil paint or watercolor. They combine carefully hand-crafted procedural rules with computationally expensive, style-specific physical simulations. Nevertheless, untrained users often find it hard to achieve their target style in these systems due to the challenge of controlling and predicting the outcome of their collective drawing strokes. Moreover, even trained digital artists are often restricted by the inherent stylistic limitations of these systems. In this thesis, we propose a data-driven painting paradigm that allows novices and experts to more easily create visually compelling artworks using exemplars. To make data-driven painting feasible and efficient, we factorize the painting process into a set of orthogonal components: 1) stroke paths; 2) hand gestures; 3) stroke textures; 4) inter-stroke interactions; 5) pigment colors. We present four prototype systems, HelpingHand, RealBrush, DecoBrush and RealPigment, to demonstrate that each component can be synthesized efficiently and independently based on small sets of decoupled exemplars. We propose efficient algorithms to acquire and process visual exemplars, and a general framework for data-driven stroke synthesis based on feature matching and optimization. With the convenience of data sharing on the Internet, this data-driven paradigm opens up new opportunities for artists and amateurs to create original stylistic artwork and to abstract, share and reproduce their styles more easily and faithfully. | en_US
dc.language.iso | en | en_US
dc.publisher | Princeton, NJ : Princeton University | en_US
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog (http://catalog.princeton.edu). | en_US
dc.subject | Data-driven | en_US
dc.subject | Design | en_US
dc.subject | Drawing | en_US
dc.subject | Non-photorealistic rendering | en_US
dc.subject | Painting | en_US
dc.subject | Stroke-based rendering | en_US
dc.subject.classification | Computer science | en_US
dc.subject.classification | Computer engineering | en_US
dc.title | Data-driven Digital Drawing and Painting | en_US
dc.type | Academic dissertations (Ph.D.) | en_US
pu.projectgrantnumber | 690-2143 | en_US
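
The abstract in the record above describes a general framework for data-driven stroke synthesis based on feature matching and optimization. The Python sketch below is a rough, hypothetical illustration only, not the thesis's actual algorithm: it assumes a tiny hand-picked feature space (segment arc length and total turning angle) and matches each segment of a query stroke to its nearest exemplar segment, with a small greedy penalty for switching exemplars standing in for the optimization over stroke coherence. All names, features, and costs here are assumptions made for illustration.

# Hypothetical sketch of exemplar-based stroke synthesis via feature matching.
# The feature descriptor and the greedy smoothness penalty are illustrative
# assumptions, not the method described in the thesis.
import numpy as np

def segment_features(points):
    """Descriptor for a polyline segment: total arc length and net turning angle."""
    points = np.asarray(points, dtype=float)
    deltas = np.diff(points, axis=0)
    length = np.linalg.norm(deltas, axis=1).sum()
    angles = np.arctan2(deltas[:, 1], deltas[:, 0])
    turning = np.abs(np.diff(angles)).sum()
    return np.array([length, turning])

def synthesize_stroke(query_segments, exemplar_segments, smoothness=0.5):
    """Greedily assign an exemplar segment to each query segment.

    Each query segment is matched to the exemplar whose features are closest,
    with a penalty for switching exemplars between consecutive segments
    (a crude stand-in for an optimization over inter-segment coherence).
    Returns the list of chosen exemplar indices.
    """
    exemplar_feats = np.array([segment_features(e) for e in exemplar_segments])
    chosen, prev = [], None
    for seg in query_segments:
        cost = np.linalg.norm(exemplar_feats - segment_features(seg), axis=1)
        if prev is not None:
            cost = cost + smoothness * (np.arange(len(exemplar_feats)) != prev)
        prev = int(np.argmin(cost))
        chosen.append(prev)
    return chosen

if __name__ == "__main__":
    # Two toy exemplar segments: a short straight dash and a longer curved hook.
    exemplars = [
        [(0, 0), (1, 0), (2, 0)],
        [(0, 0), (1, 0.5), (1.5, 1.5), (1.5, 2.5)],
    ]
    # A query stroke already split into two segments by the application.
    query = [
        [(0, 0), (0.9, 0.1), (1.8, 0.0)],
        [(1.8, 0.0), (2.5, 0.8), (2.7, 1.9)],
    ]
    print(synthesize_stroke(query, exemplars))  # indices of the chosen exemplars
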
Appears in Collections: Computer Science

Files in This Item:
File | Description | Size | Format
Lu_princeton_0181D_10938.pdf | - | 106.05 MB | Adobe PDF


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.