

For FCN, a Deconvolution layer is commonly used for upsampling. In Caffe, the parameters of such an upsampling layer are usually fixed in advance rather than learned.

Interestingly, the parameters can be set in two ways. The two ways are not fully equivalent, and each has its own use.

Method 1: set the weights via net surgery

This approach first appeared in FCN: https://github.com/shelhamer/fcn.berkeleyvision.org/blob/master/voc-fcn32s/solve.py

The code is as follows:

import caffe
import surgery, score

import numpy as np
import os
import sys

try:
    import setproctitle
    setproctitle.setproctitle(os.path.basename(os.getcwd()))
except:
    pass

weights = '../ilsvrc-nets/vgg16-fcn.caffemodel'

# init
caffe.set_device(int(sys.argv[1]))
caffe.set_mode_gpu()

solver = caffe.SGDSolver('solver.prototxt')
solver.net.copy_from(weights)

# surgeries: this is where the deconvolution layer parameters are initialized
interp_layers = [k for k in solver.net.params.keys() if 'up' in k]
surgery.interp(solver.net, interp_layers)

# scoring
val = np.loadtxt('../data/segvalid11.txt', dtype=str)

for _ in range(25):
    solver.step(4000)
    score.seg_tests(solver, False, val, layer='score')

The upsampling functions:

# make a bilinear interpolation kernel
def upsample_filt(self, size):
    factor = (size + 1) // 2
    if size % 2 == 1:
        center = factor - 1
    else:
        center = factor - 0.5
    og = np.ogrid[:size, :size]
    return (1 - abs(og[0] - center) / factor) * \
           (1 - abs(og[1] - center) / factor)

# set parameters s.t. deconvolutional layers compute bilinear interpolation
# N.B. this is for deconvolution without groups
def interp_surgery(self, net, layers):
    for l in layers:
        print l
        # only the weights are modified; the bias is left at 0
        m, k, h, w = net.params[l][0].data.shape
        print 'deconv shape:'
        print m, k, h, w
        if m != k and k != 1:
            print 'input + output channels need to be the same or |output| == 1'
            raise
        if h != w:
            print 'filters need to be square'
            raise
        filt = self.upsample_filt(h)
        print filt
        net.params[l][0].data[range(m), range(k), :, :] = filt

Method 2: specify a weight_filler directly in the Deconvolution layer, i.e.:

layer {
  name: "fc8_upsample"
  type: "Deconvolution"
  bottom: "fc8"
  top: "fc8_upsample"
  param {
    lr_mult: 0
    decay_mult: 0
  }
  param {
    lr_mult: 0
    decay_mult: 0
  }
  convolution_param {
    num_output: 1
    kernel_size: 16
    stride: 8
    pad: 3
    weight_filler {  # this plays the role of the direct assignment above
      type: "bilinear"
    }
  }
}

Initializing the weight_filler as bilinear is equivalent to assigning the values directly as above.

The two methods look the same, but in practice they differ. The key difference is the case num_output > 1.

For example, take an input with 2 channel maps that we want to upsample; naturally, we want each map to be enlarged independently. With a Deconvolution layer of kernel size 16×16, the weight blob has shape (2, 2, 16, 16); ignore the bias term.

With the initialization above, method 1 produces the following weights. The diagonal slices [0,0,:,:] and [1,1,:,:] each contain the 16×16 bilinear kernel, which ranges from 0.00390625 at the corners to 0.87890625 at the four center entries (abridged here; the full dump is just this kernel):

[0,0,:,:] = [1,1,:,:] =
[[ 0.00390625  0.01171875  0.01953125  ...  0.01953125  0.01171875  0.00390625]
 [ 0.01171875  0.03515625  0.05859375  ...  0.05859375  0.03515625  0.01171875]
 ...
 [ 0.05859375  0.17578125  0.29296875  ...  0.29296875  0.17578125  0.05859375]
 ...
 [ 0.01171875  0.03515625  0.05859375  ...  0.05859375  0.03515625  0.01171875]
 [ 0.00390625  0.01171875  0.01953125  ...  0.01953125  0.01171875  0.00390625]]

The off-diagonal slices [0,1,:,:] and [1,0,:,:] are all zeros.

With method 2, by contrast, every slice is filled with the [0,0,:,:] bilinear kernel above.

Of the two, method 1 is the correct one. A Deconvolution works like a convolution: only with the method-1 weights is each map upsampled independently. With method 2, the two output maps come out identical, because each output map combines the information of both input maps.

Conclusion: for a Deconvolution with multiple input/output channels, use method 1; for a single input channel, methods 1 and 2 are interchangeable.

Clarification: the statement above has a flaw. Caffe has in fact already solved this problem; I simply had not paid attention to it before. The key is the group option. If num_output > 1, then setting group: c (with c the number of channels) together with weight_filler { type: "bilinear" } completes the initialization correctly.
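The difference between the two initializations can be checked concretely without Caffe. Below is a minimal NumPy sketch: upsample_filt is the same kernel construction as in the surgery code above, while deconv2d is a naive transposed convolution written only for this illustration (it is not a Caffe API). It verifies that with the method-2 weights every output map is the sum of both upsampled input maps:

```python
import numpy as np

def upsample_filt(size):
    # bilinear interpolation kernel, same construction as the FCN surgery code
    factor = (size + 1) // 2
    center = factor - 1 if size % 2 == 1 else factor - 0.5
    og = np.ogrid[:size, :size]
    return (1 - abs(og[0] - center) / factor) * (1 - abs(og[1] - center) / factor)

def deconv2d(x, w, stride):
    # naive transposed convolution: x is (C_in, H, W), w is (C_in, C_out, kh, kw)
    cin, cout, kh, kw = w.shape
    H, W = x.shape[1:]
    y = np.zeros((cout, (H - 1) * stride + kh, (W - 1) * stride + kw))
    for ci in range(cin):
        for i in range(H):
            for j in range(W):
                # each input pixel scatters a scaled copy of the kernel
                y[:, i*stride:i*stride+kh, j*stride:j*stride+kw] += x[ci, i, j] * w[ci]
    return y

filt = upsample_filt(16)

# method 1: the bilinear kernel sits only on the diagonal channel pairs
w1 = np.zeros((2, 2, 16, 16))
w1[range(2), range(2)] = filt

# method 2: the bilinear filler writes the same kernel into every channel pair
w2 = np.tile(filt, (2, 2, 1, 1))

x = np.random.rand(2, 4, 4)
y1 = deconv2d(x, w1, stride=8)
y2 = deconv2d(x, w2, stride=8)

# method 1: each output map depends only on its own input map;
# method 2: every output map is the sum of both upsampled input maps
print(np.allclose(y2[0], y1[0] + y1[1]))  # True
print(np.allclose(y2[0], y2[1]))          # True: method 2's two outputs are identical
```

The two printed checks make the conclusion above concrete: method 2 mixes the channels, so its output maps are all identical, while method 1 keeps them independent.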
