Recurrent Models of Visual Attention

Authors: Volodymyr Mnih, Nicolas Heess, Alex Graves, koray kavukcuoglu
Published in: Advances in Neural Information Processing Systems 27 (NIPS 2014), pp. 2204-2212
Editors: Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, K. Q. Weinberger
Publisher: Curran Associates, Inc.
Presentation: Spotlight
Venue: Neural Information Processing Systems (http://nips.cc/)

Abstract:
Applying convolutional neural networks to large images is computationally expensive because the amount of computation scales linearly with the number of image pixels. We present a novel recurrent neural network model that is capable of extracting information from an image or video by adaptively selecting a sequence of regions or locations and only processing the selected regions at high resolution. Like convolutional neural networks, the proposed model has a degree of translation invariance built-in, but the amount of computation it performs can be controlled independently of the input image size. While the model is non-differentiable, it can be trained using reinforcement learning methods to learn task-specific policies. We evaluate our model on several image classification tasks, where it significantly outperforms a convolutional neural network baseline on cluttered images, and on a dynamic visual control problem, where it learns to track a simple object without an explicit training signal for doing so.
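The abstract notes that the model is non-differentiable (the glimpse location is a discrete sample) and is therefore trained with reinforcement learning. The core trick can be illustrated with a minimal REINFORCE sketch: a policy over glimpse locations is updated by a reward-weighted score-function gradient. This toy is an assumption-laden illustration of that idea, not the paper's architecture — there is no RNN or glimpse network here, and all names (`n_locations`, `target`, `theta`) are invented for the example.

```python
import numpy as np

# Toy REINFORCE: learn "where to look" when the choice is non-differentiable.
# The agent picks one of n_locations glimpse positions; only `target`
# contains the object, so reward is 1 when it looks there, else 0.
rng = np.random.default_rng(0)
n_locations = 5
target = 3
theta = np.zeros(n_locations)  # logits of the location policy

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr = 0.5
for step in range(500):
    probs = softmax(theta)
    loc = rng.choice(n_locations, p=probs)   # sample a glimpse location
    reward = 1.0 if loc == target else 0.0   # task-specific reward signal
    # Score-function gradient: d/dtheta log pi(loc) = one_hot(loc) - probs
    grad_logp = -probs
    grad_logp[loc] += 1.0
    theta += lr * reward * grad_logp         # ascend expected reward

print(softmax(theta).argmax())  # policy concentrates on the rewarded location
```

Despite the sampling step blocking ordinary backpropagation, the expected reward is still improved, because the gradient of the log-probability of the sampled action is weighted by the reward actually obtained — the same principle the paper uses to train its glimpse policy on classification and tracking tasks.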