📜  Hadoop - Streaming

📅  Last modified: 2020-12-01 06:43:42             🧑  Author: Mango


Hadoop Streaming is a utility that comes with the Hadoop distribution. This utility allows you to create and run Map/Reduce jobs with any executable or script as the mapper and/or the reducer.

Example Using Python

For Hadoop Streaming, we are considering the word-count problem. Any job in Hadoop must have two phases: mapper and reducer. We have written the mapper and the reducer as Python scripts to run them under Hadoop. One can also write the same in Perl and Ruby.

Mapper Phase Code

#!/usr/bin/python

import sys

# Input takes from standard input
for myline in sys.stdin:
   # Remove whitespace either side
   myline = myline.strip()

   # Break the line into words
   words = myline.split()

   # Iterate the words list
   for myword in words:
      # Write the results to standard output
      print '%s\t%s' % (myword, 1)

Make sure this file has execution permission (chmod +x /home/expert/hadoop-1.2.1/mapper.py).
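The mapper can be tested on its own before involving Hadoop, for example by piping a line of text into it (the sample sentence below is arbitrary):

$ echo "the quick brown fox jumps over the lazy dog" | ./mapper.py

Each word should appear on its own line, followed by a tab and the count 1.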

Reducer Phase Code

#!/usr/bin/python

from operator import itemgetter
import sys

current_word = ""
current_count = 0
word = ""

# Input takes from standard input
for myline in sys.stdin:
   # Remove whitespace either side
   myline = myline.strip()

   # Split the input we got from mapper.py
   word, count = myline.split('\t', 1)

   # Convert count variable to integer
   try:
      count = int(count)
   except ValueError:
      # Count was not a number, so silently ignore this line
      continue

   if current_word == word:
      current_count += count
   else:
      if current_word:
         # Write result to standard output
         print '%s\t%s' % (current_word, current_count)
      current_count = count
      current_word = word

# Do not forget to output the last word if needed!
if current_word == word:
   print '%s\t%s' % (current_word, current_count)

Save the mapper and reducer code in mapper.py and reducer.py in the Hadoop home directory. Make sure these files have execution permission (chmod +x mapper.py and chmod +x reducer.py). As Python is indentation sensitive, the same code can be downloaded from the link below.
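Since streaming only pipes data through these scripts over STDIN/STDOUT, the whole word-count flow can also be simulated locally with Unix pipes before submitting a job (a rough sketch; input.txt stands for any hypothetical text file):

$ cat input.txt | ./mapper.py | sort -k1,1 | ./reducer.py

Here the sort command stands in for Hadoop's shuffle phase, which delivers the mapper output to the reducer grouped and ordered by key.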

Execution of the WordCount Program

$ $HADOOP_HOME/bin/hadoop jar contrib/streaming/hadoop-streaming-1.2.1.jar \
   -input input_dirs \
   -output output_dir \
   -mapper <path/to/mapper.py> \
   -reducer <path/to/reducer.py>

Where "\" is used for line continuation for clear readability.

For example,

./bin/hadoop jar contrib/streaming/hadoop-streaming-1.2.1.jar -input myinput -output myoutput -mapper /home/expert/hadoop-1.2.1/mapper.py -reducer /home/expert/hadoop-1.2.1/reducer.py
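Once the job finishes, the results can be read back from the output directory on HDFS (assuming the default single reducer, the counts land in a part-00000 file):

$ $HADOOP_HOME/bin/hadoop fs -ls myoutput
$ $HADOOP_HOME/bin/hadoop fs -cat myoutput/part-00000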

How Streaming Works

In the above example, both the mapper and the reducer are Python scripts that read input from standard input and emit output to standard output. The utility creates a Map/Reduce job, submits the job to an appropriate cluster, and monitors the progress of the job until it completes.

When a script is specified for the mappers, each mapper task launches the script as a separate process when the mapper is initialized. As the mapper task runs, it converts its inputs into lines and feeds the lines to the standard input (STDIN) of the process. In the meantime, the mapper collects the line-oriented outputs from the standard output (STDOUT) of the process and converts each line into a key/value pair, which is collected as the output of the mapper. By default, the prefix of a line up to the first tab character is the key and the rest of the line (excluding the tab character) is the value. If there is no tab character in the line, the entire line is considered the key and the value is null. However, this can be customized as per need.
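This default key/value convention can be pictured with a small Python sketch (purely illustrative; the helper function below is not part of Hadoop):

def split_streaming_line(line):
   # Default rule: the prefix up to the first tab is the key,
   # the remainder (excluding the tab) is the value; with no tab,
   # the whole line becomes the key and the value is null (None here).
   if '\t' in line:
      key, value = line.split('\t', 1)
   else:
      key, value = line, None
   return key, value

print split_streaming_line('hello\t1')   # ('hello', '1')
print split_streaming_line('hello')      # ('hello', None)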

When a script is specified for the reducers, each reducer task launches the script as a separate process, and then the reducer is initialized. As the reducer task runs, it converts its input key/value pairs into lines and feeds the lines to the standard input (STDIN) of the process. In the meantime, the reducer collects the line-oriented outputs from the standard output (STDOUT) of the process, converts each line into a key/value pair, which is collected as the output of the reducer. By default, the prefix of a line up to the first tab character is the key and the rest of the line (excluding the tab character) is the value. However, this can be customized as per specific requirements.

Important Commands

| Parameters | Options | Description |
| --- | --- | --- |
| -input directory/file-name | Required | Input location for mapper. |
| -output directory-name | Required | Output location for reducer. |
| -mapper executable or script or JavaClassName | Required | Mapper executable. |
| -reducer executable or script or JavaClassName | Required | Reducer executable. |
| -file file-name | Optional | Makes the mapper, reducer, or combiner executable available locally on the compute nodes. |
| -inputformat JavaClassName | Optional | Class you supply should return key/value pairs of Text class. If not specified, TextInputFormat is used as the default. |
| -outputformat JavaClassName | Optional | Class you supply should take key/value pairs of Text class. If not specified, TextOutputFormat is used as the default. |
| -partitioner JavaClassName | Optional | Class that determines which reduce a key is sent to. |
| -combiner streamingCommand or JavaClassName | Optional | Combiner executable for map output. |
| -cmdenv name=value | Optional | Passes the environment variable to streaming commands. |
| -inputreader | Optional | For backwards-compatibility: specifies a record reader class (instead of an input format class). |
| -verbose | Optional | Verbose output. |
| -lazyOutput | Optional | Creates output lazily. For example, if the output format is based on FileOutputFormat, the output file is created only on the first call to output.collect (or Context.write). |
| -numReduceTasks | Optional | Specifies the number of reducers. |
| -mapdebug | Optional | Script to call when map task fails. |
| -reducedebug | Optional | Script to call when reduce task fails. |
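Several of the optional parameters above can be combined in one invocation. The following is only a sketch that reuses the mapper.py and reducer.py from this chapter; the option values (two reducers, the MY_ENV variable) are illustrative assumptions, not required settings:

$ $HADOOP_HOME/bin/hadoop jar contrib/streaming/hadoop-streaming-1.2.1.jar \
   -input myinput \
   -output myoutput \
   -mapper mapper.py \
   -reducer reducer.py \
   -file /home/expert/hadoop-1.2.1/mapper.py \
   -file /home/expert/hadoop-1.2.1/reducer.py \
   -numReduceTasks 2 \
   -cmdenv MY_ENV=value

Shipping the scripts with -file lets the -mapper and -reducer options refer to them by file name on the compute nodes.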