Contents

1. MapReduce Job Workflow
2. Hands-on Practice
   2.1 Starting Hadoop
   2.2 Creating the Java Project
   2.3 MapReduce Shell
   2.4 MapReduce Web UI
3. MapReduce Programming Practice: Counting Certain Attributes of Objects

Reference: 《Hadoop大数据原理与应用》 (Principles and Applications of Hadoop Big Data)

1. MapReduce Job Workflow

(Figure: the MapReduce job workflow; the original illustration is not reproduced here.)

2. Hands-on Practice

2.1 Starting Hadoop

```bash
start-dfs.sh
start-yarn.sh
mr-jobhistory-daemon.sh start historyserver
# The third command is deprecated; use the following instead:
mapred --daemon start historyserver
```

2.2 Creating the Java Project

WordCountMapper.java

```java
package com.michael.mapreduce;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import java.io.IOException;

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    // custom map method
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String[] words = line.split(" ");
        for (String word : words) {
            // context.write() hands each (word, 1) pair to the next stage: shuffle
            context.write(new Text(word), new IntWritable(1));
        }
    }
}
```

WordCountReducer.java

```java
package com.michael.mapreduce;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import java.io.IOException;

public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    // custom reduce method
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```

WordCountDriver.java (the driver class configures this job)

```java
package com.michael.mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class WordCountDriver {
    // args: input and output paths
    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        // enable compression of map output
        conf.setBoolean("mapreduce.map.output.compress", true);
        // specify the compression codec
        conf.setClass("mapreduce.map.output.compress.codec", BZip2Codec.class, CompressionCodec.class);
        Job job = Job.getInstance(conf, "word count diy");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);
        // use the reducer as a combiner
        job.setCombinerClass(WordCountReducer.class);
        job.setReducerClass(WordCountReducer.class);
        // map output key/value types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        // reduce output key/value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // input path
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        // output path
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // also compress the final (reduce-side) output
        FileOutputFormat.setCompressOutput(job, true);
        // use the same codec as the map-side compression above
        FileOutputFormat.setOutputCompressorClass(job, BZip2Codec.class);
        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}
```
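One inconvenience of WordCountDriver is that the compression settings are hard-coded, so changing them means editing the source and re-exporting the jar. A common alternative, sketched below (the WordCountTool class is my own illustration, not from the book), is to drive the job through Hadoop's ToolRunner, whose generic option parsing lets the same settings be passed as -D flags at submission time:

```java
package com.michael.mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical ToolRunner-based driver (illustration only). Submit with, e.g.:
//   hadoop jar wordcount_diy.jar com.michael.mapreduce.WordCountTool \
//     -Dmapreduce.map.output.compress=true \
//     -Dmapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.BZip2Codec \
//     /InputDataTest /OutputDataTest1
public class WordCountTool extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() already contains any -D overrides parsed by ToolRunner
        Job job = Job.getInstance(getConf(), "word count via ToolRunner");
        job.setJarByClass(WordCountTool.class);
        job.setMapperClass(WordCountMapper.class);
        job.setCombinerClass(WordCountReducer.class);
        job.setReducerClass(WordCountReducer.class);
        // map and final output types are the same here, so one pair suffices
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new WordCountTool(), args));
    }
}
```

Functionally this runs the same job as WordCountDriver; the only design change is where the configuration comes from.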
Export wordcount_diy.jar and submit it to Hadoop:

```bash
hadoop jar /home/dnn/eclipse-workspace/HDFS_example/wordcount_diy.jar com.michael.mapreduce.WordCountDriver /InputDataTest /OutputDataTest1
```

View the result:

```bash
hdfs dfs -cat /OutputDataTest1/part-r-00000.bz2
```

This prints gibberish because the file is bzip2-compressed; download it and decompress it first.

Download:

```bash
hdfs dfs -get /OutputDataTest1/part-r-00000.bz2 /home/dnn/eclipse-workspace/HDFS_example/part-r-00000.bz2
```

View:

```bash
bzcat /home/dnn/eclipse-workspace/HDFS_example/part-r-00000.bz2
```

2.3 MapReduce Shell

Check the job status:

```bash
mapred job -status job_1615849408082_0001
```

```
[dnn@master Desktop]$ mapred job -status job_1615849408082_0001
WARNING: HADOOP_MAPRED_PID_DIR has been replaced by HADOOP_PID_DIR. Using value of HADOOP_MAPRED_PID_DIR.
2021-03-26 04:25:14,881 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at master/192.168.253.130:8032
2021-03-26 04:25:15,939 INFO mapred.ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server

Job: job_1615849408082_0001
Job File: hdfs://192.168.253.130:9000/tmp/hadoop-yarn/staging/history/done/2021/03/24/000000/job_1615849408082_0001_conf.xml
Job Tracking URL : http://master:19888/jobhistory/job/job_1615849408082_0001
Uber job : false
Number of maps: 3
Number of reduces: 1
map() completion: 1.0
reduce() completion: 1.0
Job state: SUCCEEDED
retired: false
reason for failure:
Counters: 54
	File System Counters
		FILE: Number of bytes read=6640
		FILE: Number of bytes written=1072644
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=25631
		HDFS: Number of bytes written=4967
		HDFS: Number of read operations=14
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
		HDFS: Number of bytes read erasure-coded=0
	Job Counters
		Launched map tasks=3
		Launched reduce tasks=1
		Data-local map tasks=3
		Total time spent by all maps in occupied slots (ms)=43801
		Total time spent by all reduces in occupied slots (ms)=5037
		Total time spent by all map tasks (ms)=43801
		Total time spent by all reduce tasks (ms)=5037
		Total vcore-milliseconds taken by all map tasks=43801
		Total vcore-milliseconds taken by all reduce tasks=5037
		Total megabyte-milliseconds taken by all map tasks=44852224
		Total megabyte-milliseconds taken by all reduce tasks=5157888
	Map-Reduce Framework
		Map input records=667
		Map output records=3833
		Map output bytes=40605
		Map output materialized bytes=8455
		Input split bytes=358
		Combine input records=3833
		Combine output records=1264
		Reduce input groups=913
		Reduce shuffle bytes=8455
		Reduce input records=1264
		Reduce output records=913
		Spilled Records=2528
		Shuffled Maps =3
		Failed Shuffles=0
		Merged Map outputs=3
		GC time elapsed (ms)=818
		CPU time spent (ms)=3140
		Physical memory (bytes) snapshot=599461888
		Virtual memory (bytes) snapshot=10950950912
		Total committed heap usage (bytes)=385351680
		Peak Map Physical memory (bytes)=167784448
		Peak Map Virtual memory (bytes)=2735529984
		Peak Reduce Physical memory (bytes)=96972800
		Peak Reduce Virtual memory (bytes)=2744360960
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=25273
	File Output Format Counters
		Bytes Written=4967
```

2.4 MapReduce Web UI

The job history server's web UI is available at http://192.168.253.130:19888/jobhistory.

3. MapReduce Programming Practice: Counting Certain Attributes of Objects
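The original post ends at this heading. As a minimal sketch of what counting an attribute of objects can look like, assume hypothetical CSV input lines of the form name,department,salary (the Employee bean, the field layout, and the SalaryMapper class below are my own illustration, not taken from the book):

```java
package com.michael.mapreduce;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import java.io.IOException;

public class SalaryMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // A plain bean standing in for the "object" whose attribute we aggregate.
    private static class Employee {
        final String department;
        final int salary;

        Employee(String csvLine) {
            // hypothetical layout: name,department,salary
            String[] fields = csvLine.split(",");
            this.department = fields[1];
            this.salary = Integer.parseInt(fields[2]);
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        Employee e = new Employee(value.toString());
        // emit (department, salary); a summing reducer totals per department
        context.write(new Text(e.department), new IntWritable(e.salary));
    }
}
```

Since the WordCountReducer shown earlier simply sums the IntWritable values for each key, it can be reused unchanged as the reducer (and combiner) of this job to produce a salary total per department.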