Few-Shot Text Style Transfer via Deep Feature Similarity

Anna Zhu, Xiongbo Lu, Xiang Bai, Seiichi Uchida, Brian Kenji Iwana, Shengwu Xiong

Research output: Contribution to journal › Article › peer-review


Generating text with a consistent style from only a few observed, highly stylized text samples is a difficult image-processing task. The text style to be transferred involves the typography, i.e., font, stroke, color, decoration, effects, etc. In this paper, we propose a novel approach that stylizes target text by decoding weighted deep features from only a few referenced samples. The deep features, including the content and style features of each referenced text, are extracted from a Convolutional Neural Network (CNN) that is optimized for character recognition. We then calculate similarity scores between the target text and the referenced samples by measuring the distance along the corresponding channels of the CNN's content features (considering content only), and use these scores as weights for aggregating the deep features. To enforce that the stylized text is realistic, a discriminative network with an adversarial loss is employed. We demonstrate the effectiveness of our network through experiments on three datasets spanning various styles, fonts, and languages. Additionally, we analyze the factors affecting character style transfer, including the character content, the effect of the similarity matrix, the number of referenced characters, and the similarity between characters, and evaluate performance with a new protocol for a better understanding of our proposed framework.
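The core mechanism the abstract describes can be illustrated with a small sketch: compute a similarity weight for each reference sample from the distance between its content features and the target's, then aggregate the references' style features using those weights. This is a hypothetical illustration, not the authors' code; the feature dimensions, the softmax-over-negative-distance weighting, and the function names are all assumptions.

```python
# Hypothetical sketch of similarity-weighted style aggregation
# (not the authors' implementation). Content/style feature vectors
# stand in for the CNN feature maps described in the paper.
import numpy as np

def similarity_weights(target_content, ref_contents):
    """Weights per reference sample: softmax over negative L2 distances
    between content feature vectors (distance measure is an assumption)."""
    dists = np.array([np.linalg.norm(target_content - r) for r in ref_contents])
    scores = np.exp(-dists)
    return scores / scores.sum()

def aggregate_style(target_content, ref_contents, ref_styles):
    """Weighted sum of reference style features; a decoder would then
    render the target character from this aggregated style code."""
    w = similarity_weights(target_content, ref_contents)
    return np.tensordot(w, np.stack(ref_styles), axes=1)

# Toy example: 3 reference characters, 64-dim content / 128-dim style codes
rng = np.random.default_rng(0)
target_c = rng.normal(size=64)
refs_c = [rng.normal(size=64) for _ in range(3)]
refs_s = [rng.normal(size=128) for _ in range(3)]
style_code = aggregate_style(target_c, refs_c, refs_s)
print(style_code.shape)  # (128,)
```

The weights sum to one, so references whose content features lie closer to the target dominate the aggregated style code, which matches the abstract's description of similarity scores acting as aggregation weights.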

Journal: IEEE Transactions on Image Processing
Publication status: Published - 2020

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design

