Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01m613n1549
Title: | Using Computer Vision to Model Fashion Outfit Compatibility |
Authors: | Zeng, Andrew |
Advisors: | Russakovsky, Olga |
Department: | Computer Science |
Class Year: | 2020 |
Abstract: | Fashion retailers generate large amounts of clothing-related data such as clothing item images, clothing item metadata, and outfits. Computer vision can aid in the task of building outfits by creating a model that learns both similarity between clothing items that are interchangeable and compatibility between clothing items of different types that go well together in an outfit. To achieve this, a model needs to compare images across various similarity conditions such as color, shape, and category. A recent state-of-the-art method, the Similarity Condition Embedding Network (SCE-Net), learns multiple similarity conditions without explicit supervision from a unified embedding space, producing image embeddings that can be used to score outfits. In this paper, we examine the performance of this network on outfit compatibility and fill-in-the-blank tasks for an online clothing retail dataset from H&M to better understand how the network learns concepts of similarity and compatibility in the fashion domain. To further explore its performance, we also create a messaging app that acts as a virtual stylist by using the trained model. |
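For context, an SCE-Net-style model scores a pair of items by embedding each image, projecting the embeddings into several learned condition subspaces, and combining the per-condition distances with weights predicted from the pair itself, so no explicit condition labels are needed. The following is a minimal PyTorch sketch of that idea; the dimensions, layer sizes, and class name are illustrative assumptions, not the thesis's implementation.

```python
# Minimal sketch of an SCE-Net-style conditional embedding.
# Dimensions and layer sizes are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCENetSketch(nn.Module):
    def __init__(self, embed_dim=64, num_conditions=5):
        super().__init__()
        # One learned projection ("condition mask") per similarity
        # condition (e.g. color, shape, category).
        self.conditions = nn.ModuleList(
            [nn.Linear(embed_dim, embed_dim, bias=False)
             for _ in range(num_conditions)]
        )
        # Weight branch: predicts how relevant each condition is for a
        # given pair, learned without explicit condition supervision.
        self.weight_branch = nn.Sequential(
            nn.Linear(2 * embed_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_conditions),
        )

    def forward(self, x, y):
        # x, y: (batch, embed_dim) image features from a shared CNN backbone.
        w = F.softmax(self.weight_branch(torch.cat([x, y], dim=1)), dim=1)
        # Distance between the pair in each condition subspace.
        dists = torch.stack(
            [F.pairwise_distance(c(x), c(y)) for c in self.conditions],
            dim=1,
        )
        # Compatibility distance: condition distances weighted by relevance
        # (lower distance means the two items go better together).
        return (w * dists).sum(dim=1)
```

In an outfit-scoring setting, a sketch like this would be trained with a triplet or contrastive loss on compatible versus incompatible item pairs, and an outfit's score obtained by aggregating pairwise distances over its items.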
URI: | http://arks.princeton.edu/ark:/88435/dsp01m613n1549 |
Type of Material: | Princeton University Senior Theses |
Language: | en |
Appears in Collections: | Computer Science, 1988-2020 |
Files in This Item:
File | Description | Size | Format
---|---|---|---
ZENG-ANDREW-THESIS.pdf | | 2.17 MB | Adobe PDF