{"id":391,"date":"2022-04-01T12:36:50","date_gmt":"2022-04-01T12:36:50","guid":{"rendered":"https:\/\/blog.liguanxin.cn\/?p=391"},"modified":"2022-04-01T12:37:44","modified_gmt":"2022-04-01T12:37:44","slug":"%e8%ae%ba%e6%96%87%e7%ac%94%e8%ae%b0-tvconv-efficient-translation-variant-convolution-for-layout-aware-visual-processing","status":"publish","type":"post","link":"https:\/\/blog.liguanxin.cn\/index.php\/2022\/04\/01\/%e8%ae%ba%e6%96%87%e7%ac%94%e8%ae%b0-tvconv-efficient-translation-variant-convolution-for-layout-aware-visual-processing\/","title":{"rendered":"Paper Notes \u2013 TVConv: Efficient Translation Variant Convolution for Layout-aware Visual Processing"},"content":{"rendered":"<p>(CVPR 2022 paper)<\/p>\n<p><strong>Problem with prior work: existing static and dynamic convolutions are either layout-agnostic or computationally expensive, which makes them unsuitable for layout-specific applications such as face recognition and medical image segmentation.<\/strong><\/p>\n
<p><strong>Contributions:<br \/>\n\u2460 Observing that images in tasks such as face recognition exhibit large intra-image variance but small cross-image variance, the authors propose TVConv for layout-aware visual processing.<br \/>\n\u2461 TVConv consists of affinity maps and a weight-generating block: the affinity maps depict pixel-paired relationships, while the weight-generating block avoids over-parameterization.<br \/>\n\u2462 Compared with depthwise convolution, TVConv reduces the computational cost by up to 3.1\u00d7 and improves the corresponding throughput by 2.3\u00d7 while maintaining high accuracy.<\/strong><\/p>\n
<h1>Comparison with Other Convolutions<\/h1>\n<p>TVConv first specifies learnable affinity maps to distinguish local features; the affinity maps implicitly capture the semantic relationships between different regions and can be viewed as a form of attention. The affinity maps are then fed into a weight-generating block to produce weights, which are applied to the input as filters. After training, the affinity maps can still be adapted to fit different spatial layouts.<br \/>\n
<img src=\"https:\/\/blog.liguanxin.cn\/wp-content\/uploads\/2022\/04\/\u5fae\u4fe1\u622a\u56fe_20220401193521.png\" alt=\"\" \/><br \/>\n
In the figure, the first image on the left shows that TVConv's weights vary across regions, while the second image shows that an ordinary convolution shares the same weights in every region.<br \/>\nThe chart in the top-right corner plots cross-image variance on the x-axis and intra-image variance on the y-axis.<\/p>\n
<h1>Overall Structure<\/h1>\n<p>Ideally one would learn a full translation-variant weight tensor W, but storing it directly takes <span class=\"katex-eq\" data-katex-display=\"false\">c\u00d7k\u00d7k\u00d7h\u00d7w<\/span> parameters (k is the kernel size), which is enormous and prone to overfitting. W is therefore factorized as W&#8217;=B&#8217;A&#8217;, where W&#8217; is a reshaped version of W of size <span class=\"katex-eq\" data-katex-display=\"false\">(c\u00d7k\u00d7k)\u00d7(h\u00d7w)<\/span>, B&#8217; is a basis matrix of size <span class=\"katex-eq\" data-katex-display=\"false\">(c\u00d7k\u00d7k)\u00d7c_A<\/span>, and A&#8217; is a coefficient matrix of size <span class=\"katex-eq\" data-katex-display=\"false\">c_A\u00d7(h\u00d7w)<\/span>.<br \/>\n
This factorization reduces the parameter count from <span class=\"katex-eq\" data-katex-display=\"false\">ckkhw<\/span> to <span class=\"katex-eq\" data-katex-display=\"false\">ckkc_A + c_Ahw<\/span>; in practice <span class=\"katex-eq\" data-katex-display=\"false\">c_A=1<\/span> is usually chosen.<br \/>\n
<img src=\"https:\/\/blog.liguanxin.cn\/wp-content\/uploads\/2022\/04\/\u5fae\u4fe1\u622a\u56fe_20220401195412.png\" alt=\"\" \/><br \/>\n
A denotes the affinity maps and B denotes the weight-generating block.<\/p>\n
<h1>CODE<\/h1>\n<p><img src=\"https:\/\/blog.liguanxin.cn\/wp-content\/uploads\/2022\/04\/\u5fae\u4fe1\u622a\u56fe_20220401202617.png\" alt=\"\" \/><\/p>\n
<pre><code class=\"language-python\">import torch\nimport torch.nn as nn\n\n\nclass TVConv(nn.Module):\n    def __init__(self,\n                 channels,\n                 TVConv_k=3,\n                 stride=1,\n                 TVConv_posi_chans=4,\n                 TVConv_inter_chans=16,\n                 TVConv_inter_layers=1,\n                 TVConv_Bias=False,\n                 h=3,\n                 w=3,\n                 **kwargs):\n        super(TVConv, self).__init__()\n        # Register the hyperparameters as buffers so they are saved along with the model\n        self.register_buffer(&quot;TVConv_k&quot;, torch.as_tensor(TVConv_k))\n        self.register_buffer(&quot;TVConv_k_square&quot;, torch.as_tensor(TVConv_k**2))\n        self.register_buffer(&quot;stride&quot;, torch.as_tensor(stride))\n        self.register_buffer(&quot;channels&quot;, torch.as_tensor(channels))\n        self.register_buffer(&quot;h&quot;, torch.as_tensor(h))\n        self.register_buffer(&quot;w&quot;, torch.as_tensor(w))\n\n        self.bias_layers = None\n        # Number of output channels, i.e. the k^2*c in the figure\n        out_chans = self.TVConv_k_square * self.channels\n        # Affinity maps, default shape 1*4*h*w\n        self.posi_map = nn.Parameter(torch.Tensor(1, TVConv_posi_chans, h, w))\n        # Initialize the affinity maps to ones\n        nn.init.ones_(self.posi_map)\n        # Weight-generating block: as shown in the figure above, a stack of (by default one) 3*3 conv + LayerNorm + ReLU triples, closed by a final 3*3 conv\n        self.weight_layers = self._make_layers(TVConv_posi_chans, TVConv_inter_chans, out_chans, TVConv_inter_layers, h, w)\n        # Optional bias branch: convolves the affinity maps to produce a bias with the same number of channels as the input, which is added to the output\n        if TVConv_Bias:\n            self.bias_layers = self._make_layers(TVConv_posi_chans, TVConv_inter_chans, channels, TVConv_inter_layers, h, w)\n\n        self.unfold = nn.Unfold(TVConv_k, 1, (TVConv_k-1)\/\/2, stride)  # extracts the k*k sliding patches only, without accumulating\n\n    def _make_layers(self, in_chans, inter_chans, out_chans, num_inter_layers, h, w):\n        ...\n\n    def forward(self, x):\n        # Run the weight-generating block over the affinity maps\n        weight = self.weight_layers(self.posi_map)\n        # Reshape to (1, c, k^2, h, w)\n        weight = weight.view(1, self.channels, self.TVConv_k_square, self.h, self.w)\n        # Unfold the input and reshape it to the same shape as the weight above\n        out = self.unfold(x).view(x.shape[0], self.channels, self.TVConv_k_square, self.h, self.w)\n        # Multiply element-wise and sum over each k*k window\n        out = (weight * out).sum(dim=2)\n        # Add the generated per-position bias\n        if self.bias_layers is not None:\n            bias = self.bias_layers(self.posi_map)\n            out = out + bias\n\n        return out<\/code><\/pre>\n
","protected":false},"excerpt":{"rendered":"<p>(CVPR 2022 paper) Problem with prior work: existing static and dynamic convolutions are either layout-agnostic or computationally expensive, which makes them unsuitable for layout-specific [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":396,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[6],"tags":[14,11],"_links":{"self":[{"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/posts\/391"}],"collection":[{"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/comments?post=391"}],"version-history":[{"count":0,"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/posts\/391\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/media\/396"}],"wp:attachment":[{"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/media?parent=391"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/categories?post=391"},{"taxonomy":"post_tag","embeddable":true,"
href":"https:\/\/blog.liguanxin.cn\/index.php\/wp-json\/wp\/v2\/tags?post=391"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}