Automatic Image Transformation for Inducing Affect

Abstract

To appear in BMVC 2017

Current image transformation and recoloring algorithms introduce artistic effects into photographs based on user-supplied target image(s) or a selection of pre-designed filters. Although these manipulations are intended to enhance an image's impact on the viewer, they do not let the user transform an image by specifying the desired affect. In this paper we present an automatic image-transformation method that transforms the source image such that it induces the emotional affect, desired by the user, on the viewer. Our novel image emotion-transfer algorithm does not require a user-specified target image. Instead, it uses features extracted from the top layers of a deep convolutional neural network, together with the user-specified emotion distribution, to select multiple target images from an image database for color transformation, such that the resulting image has the desired emotional impact. Our method can handle a more diverse set of photographs than previous methods. We conducted a detailed user study showing the effectiveness of the proposed method. We also provide a discussion and analysis of failure cases, indicating an inherent limitation of color-transfer-based methods for emotion assignment.

Authors

Afsheen Rafaqat Ali, Mohsen Ali

Bibtex

@inproceedings{affectColor2017,
  title={Automatic Image Transformation for Inducing Affect},
  author={Afsheen Rafaqat Ali and Mohsen Ali},
  booktitle={British Machine Vision Conference},
  year={2017}
}

Contribution

In this paper, we present a framework for image transformation such that the transformed source image carries the affect desired by the user. Unlike previous methods, our algorithm requires the user to input only the source image and a seven-dimensional discrete probability distribution representing the proportions of seven emotions (anger, disgust, fear, joy, sadness, surprise and neutral). We use features extracted from the top layers of deep convolutional networks (which encapsulate both in-image context and content information) to select target images that have content and spatial context similar to the source image. In addition, we ensure that the emotion distributions of the selected target images match the desired emotion distribution, as sketched below.
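To make the selection step concrete, the sketch below ranks database images by combining semantic similarity (cosine similarity between CNN features, e.g. activations from a top fully-connected layer) with emotional closeness (KL divergence between the desired and candidate emotion distributions). The weighting alpha and the exponentiated KL score are illustrative assumptions, not the exact criterion from the paper.

import numpy as np

def kl_divergence(p, q, eps=1e-8):
    # KL(p || q) between two discrete 7-bin emotion distributions
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=np.float64), np.asarray(b, dtype=np.float64)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def select_targets(src_feature, desired_dist, database, k=3, alpha=0.5):
    # database: list of (cnn_feature, emotion_distribution) pairs.
    # alpha is a hypothetical knob trading semantics against emotion.
    scores = []
    for feature, dist in database:
        semantic = cosine_similarity(src_feature, feature)      # higher is better
        emotional = np.exp(-kl_divergence(desired_dist, dist))  # higher is better
        scores.append(alpha * semantic + (1.0 - alpha) * emotional)
    return list(np.argsort(scores)[::-1][:k])                   # indices of top-k targets

# Example: ask for mostly anger, with some fear and sadness.
# Order: anger, disgust, fear, joy, sadness, surprise, neutral.
rng = np.random.default_rng(0)
database = [(rng.random(4096), rng.dirichlet(np.ones(7))) for _ in range(100)]
desired = np.array([0.50, 0.05, 0.15, 0.00, 0.25, 0.00, 0.05])
print(select_targets(rng.random(4096), desired, database))

The 4096-dimensional random vectors above merely stand in for real CNN features; in practice they would be extracted from the source and database images.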

Figure 1: Block diagram of the proposed affect assignment algorithm.
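The final stage of the pipeline recolors the source image toward the selected targets. The paper's exact transform is not reproduced here; as a rough illustration, the sketch below performs classic Reinhard-style statistics matching against a single target (per-channel mean and standard-deviation matching; Reinhard et al. operate in the decorrelated lαβ color space, which is omitted here for brevity).

import numpy as np

def transfer_color_stats(source, target):
    # source, target: float images in [0, 1], shape (H, W, 3).
    # Shift and scale each source channel so its mean/std match the target's.
    out = source.astype(np.float64).copy()
    for c in range(3):
        s_mu, s_sigma = out[..., c].mean(), out[..., c].std() + 1e-8
        t_mu, t_sigma = target[..., c].mean(), target[..., c].std()
        out[..., c] = (out[..., c] - s_mu) * (t_sigma / s_sigma) + t_mu
    return np.clip(out, 0.0, 1.0)

# Usage on random stand-in images:
src = np.random.rand(64, 64, 3)
tgt = np.random.rand(64, 64, 3)
recolored = transfer_color_stats(src, tgt)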

Qualitative Results

We apply our algorithm to images from the Emotion6 dataset and to a few well-known photographs. The following figures demonstrate our emotion-transfer results.

Emotion transformation on some popular photographs. In the first row, we increase anger with a bit of sadness and fear, whereas in the second we increase fear with a bit of sadness.
Given a source image and a target emotion distribution, our system recolors the image by automatically selecting target images (from the Emotion6 dataset) that are semantically similar to the source image and whose emotion distributions match the provided target emotion distribution.
Emotion transformation on some images from the ArtPhoto dataset. First row: change of affect class. Second row: enhancing the arousal value of the induced emotion.