MapR: Testing

From Define Wiki
 

Revision as of 10:07, 27 March 2015

== Test MapR Installation with pre-installed examples ==

<syntaxhighlight>
[mapr@smuk01 ~]$ cd /opt/mapr/hadoop/hadoop-2.5.1/share/hadoop/mapreduce/
[mapr@smuk01 mapreduce]$ hadoop jar hadoop-mapreduce-examples-2.5.1-mapr-1501.jar
An example program must be given as the first argument. Valid program names are:

 aggregatewordcount: An Aggregate based map/reduce program that counts the words in the input files.
 aggregatewordhist: An Aggregate based map/reduce program that computes the histogram of the words in the input files.
 bbp: A map/reduce program that uses Bailey-Borwein-Plouffe to compute exact digits of Pi.
 blocklocality: Checking Map job locality
 dbcount: An example job that counts the pageview counts from a database.
 distbbp: A map/reduce program that uses a BBP-type formula to compute exact bits of Pi.
 grep: A map/reduce program that counts the matches of a regex in the input.
 join: A job that effects a join over sorted, equally partitioned datasets
 multifilewc: A job that counts words from several files.
 pentomino: A map/reduce tile laying program to find solutions to pentomino problems.
 pi: A map/reduce program that estimates Pi using a quasi-Monte Carlo method.
 randomtextwriter: A map/reduce program that writes 10GB of random textual data per node.
 randomwriter: A map/reduce program that writes 10GB of random data per node.
 secondarysort: An example defining a secondary sort to the reduce.
 sleep: A job that sleeps at each map and reduce task.
 sort: A map/reduce program that sorts the data written by the random writer.
 sudoku: A sudoku solver.
 terachecksum: Compute checksum of terasort output to compare with teragen checksum.
 teragen: Generate data for the terasort
 teragenwithcrc: Generate data for the terasort with CRC checksum
 terasort: Run the terasort
 terasortwithcrc: Run the terasort with CRC checksum
 teravalidate: Checking results of terasort
 teravalidaterecords: Checking results of terasort in terms of missing/duplicate records
 teravalidatewithcrc: Checking results of terasort along with crc verification
 wordcount: A map/reduce program that counts the words in the input files.
 wordmean: A map/reduce program that counts the average length of the words in the input files.
 wordmedian: A map/reduce program that counts the median length of the words in the input files.
 wordstandarddeviation: A map/reduce program that counts the standard deviation of the length of the words in the input files.

[mapr@smuk01 mapreduce]$ hadoop jar hadoop-mapreduce-examples-2.5.1-mapr-1501.jar pi 10 10
Number of Maps  = 10
Samples per Map = 10
...
15/03/27 03:17:26 INFO mapreduce.Job:  map 0% reduce 0%
15/03/27 03:17:30 INFO mapreduce.Job:  map 10% reduce 0%
15/03/27 03:17:31 INFO mapreduce.Job:  map 20% reduce 0%
15/03/27 03:17:32 INFO mapreduce.Job:  map 40% reduce 0%
15/03/27 03:17:33 INFO mapreduce.Job:  map 50% reduce 0%
15/03/27 03:17:34 INFO mapreduce.Job:  map 80% reduce 0%
15/03/27 03:17:35 INFO mapreduce.Job:  map 100% reduce 0%
15/03/27 03:17:39 INFO mapreduce.Job:  map 100% reduce 100%
15/03/27 03:17:39 INFO mapreduce.Job: Job job_1427394736582_0325 completed successfully
...
Job Finished in 19.002 seconds
Estimated value of Pi is 3.20000000000000000000
</syntaxhighlight>
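The pi job above mostly exercises compute; a wordcount run additionally exercises reads and writes through the cluster filesystem. Below is a minimal sketch of such a follow-up smoke test, assuming the same examples jar and path as above; the `/tmp/wc-in` and `/tmp/wc-out` paths are arbitrary illustrations, not part of the original page.

```shell
# Hypothetical follow-up smoke test: wordcount exercises MapRFS I/O as well.
set -e

# Stage a tiny input file locally.
echo "hello mapr hello hadoop" > /tmp/words.txt

# The remaining steps need a live cluster node; skip them elsewhere.
if ! command -v hadoop >/dev/null 2>&1; then
    echo "hadoop not on PATH; run this on a MapR node" >&2
    exit 0
fi

cd /opt/mapr/hadoop/hadoop-2.5.1/share/hadoop/mapreduce/

# Copy the input into the cluster filesystem.
hadoop fs -mkdir -p /tmp/wc-in
hadoop fs -put -f /tmp/words.txt /tmp/wc-in/

# Run the example; the output directory must not exist beforehand.
hadoop fs -rm -r -f /tmp/wc-out
hadoop jar hadoop-mapreduce-examples-2.5.1-mapr-1501.jar wordcount /tmp/wc-in /tmp/wc-out

# Inspect the result: expect hadoop=1, hello=2, mapr=1.
hadoop fs -cat /tmp/wc-out/part-r-00000
```

Checking the `part-r-00000` output against the known input is a quick way to confirm that both the MapReduce framework and the cluster filesystem round-trip data correctly.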