Survey of image style transfer based on neural network
Application of Electronic Technique, 2022, Issue 6
Liu Jianfeng, Zhong Guoyun
School of Information Engineering, East China University of Technology, Nanchang 330013, China
Abstract: To promote research on image style transfer based on neural networks, this paper summarizes and discusses the main methods and representative work in the field. It reviews traditional style transfer algorithms, introduces the basic principles and main methods of neural-network-based image style transfer in detail, analyzes the application prospects of the related fields, and finally summarizes the remaining challenges and directions for future research.
CLC number: TP183
Document code: A
DOI: 10.16157/j.issn.0258-7998.222706
Chinese citation: 刘建锋,钟国韵. 基于神经网络的图像风格迁移研究综述[J]. 电子技术应用, 2022, 48(6): 14-18.
English citation: Liu Jianfeng, Zhong Guoyun. Survey of image style transfer based on neural network[J]. Application of Electronic Technique, 2022, 48(6): 14-18.
Key words: neural network; image style transfer; texture synthesis
0 Introduction
Traditional image style transfer is usually treated and studied as a generalized texture synthesis problem: texture is sampled from the style image S and transferred onto the content image C. Efros et al. proposed a simple texture synthesis algorithm that stitches and recombines sample texture patches [1]; building on the idea of analogy, Hertzmann et al. synthesized images carrying new textures through mappings between image features [2]. Such traditional methods extract only low-level image features rather than high-level semantic information, so when stylizing images with complex colors and textures the synthesized results are far from satisfactory and are difficult to use in practical applications.
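To make the patch-based idea concrete, the following is a minimal Python sketch of a naive patch-matching texture transfer: each content patch is simply replaced by the style patch whose mean intensity is closest. This is a deliberate simplification for illustration only (Efros et al.'s quilting additionally handles patch overlaps and seam cuts); the function name, patch size, and random stand-in images are assumptions, not code from the surveyed work.

import numpy as np

def patch_transfer(content, style, patch=8):
    """Naive patch-based texture transfer: paste the style patch whose
    mean intensity best matches each (non-overlapping) content patch."""
    h = (content.shape[0] // patch) * patch
    w = (content.shape[1] // patch) * patch
    out = np.zeros((h, w, 3), dtype=style.dtype)

    # Candidate style patches and their mean intensities (crude correspondence map).
    coords = [(y, x)
              for y in range(0, style.shape[0] - patch, patch)
              for x in range(0, style.shape[1] - patch, patch)]
    style_means = np.array([style[y:y + patch, x:x + patch].mean() for y, x in coords])

    for y in range(0, h, patch):
        for x in range(0, w, patch):
            target = content[y:y + patch, x:x + patch].mean()
            best = np.argmin(np.abs(style_means - target))   # closest-intensity patch
            sy, sx = coords[best]
            out[y:y + patch, x:x + patch] = style[sy:sy + patch, sx:sx + patch]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    content = rng.random((64, 64, 3))   # stand-ins for real images
    style = rng.random((96, 96, 3))
    print(patch_transfer(content, style).shape)   # (64, 64, 3)

Even this toy version shows why such methods struggle on complex images: patch correspondence is driven by low-level statistics (here, mean intensity) rather than by semantic content.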
Based on the above discussion, although traditional image style transfer algorithms can faithfully reproduce certain specific image styles, they have clear limitations: insufficient flexibility, limited style diversity, and difficulty in extracting image structure. Entirely new algorithms are therefore needed to remove these restrictions, which gave rise to the field of neural-network-based image style transfer.
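As a rough illustration of the neural formulation this survey focuses on, the sketch below follows the optimization-based recipe popularized by Gatys et al.: the stylized image is optimized directly to minimize a content loss on deep features plus a style loss on Gram matrices. To stay self-contained, a tiny randomly initialized CNN stands in for the pretrained VGG network used in practice, so the output is illustrative only; the layer sizes, learning rate, and style weight are arbitrary assumptions.

import torch
import torch.nn.functional as F

def gram(feat):
    """Gram matrix: channel-wise feature correlations that capture
    texture/style statistics while discarding spatial layout."""
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

# Tiny random CNN as a stand-in feature extractor (real methods use pretrained VGG layers).
convs = torch.nn.ModuleList([
    torch.nn.Conv2d(3, 16, 3, padding=1),
    torch.nn.Conv2d(16, 32, 3, padding=1),
])

def extract(img):
    feats, x = [], img
    for conv in convs:
        x = F.relu(conv(x))
        feats.append(x)
    return feats

content = torch.rand(1, 3, 64, 64)   # stand-in for the content image C
style = torch.rand(1, 3, 64, 64)     # stand-in for the style image S
with torch.no_grad():                # fixed feature targets
    feats_c = extract(content)
    feats_s = extract(style)

x = content.clone().requires_grad_(True)   # the image itself is the optimization variable
opt = torch.optim.Adam([x], lr=0.05)
for step in range(100):
    opt.zero_grad()
    feats_x = extract(x)
    content_loss = F.mse_loss(feats_x[-1], feats_c[-1])        # preserve content structure
    style_loss = sum(F.mse_loss(gram(fx), gram(fs))            # match style statistics
                     for fx, fs in zip(feats_x, feats_s))
    loss = content_loss + 1e3 * style_loss                     # style weight is a hyperparameter
    loss.backward()
    opt.step()
print(f"final loss: {loss.item():.4f}")

Because both losses are defined on learned feature maps rather than raw pixels, this formulation can capture higher-level structure than the patch statistics used by traditional methods, which is precisely the limitation discussed above.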
For the full text of this article, please download: http://theprogrammingfactory.com/resource/share/2000004411.