07:09:18,335 INFO Test worker context.TestContextManager - @TestExecutionListeners is not present for class [class org.springframework.data.hadoop.scripting.ScriptingBatchTest]: using defaults.
07:09:18,337 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/scripting/ScriptingBatchTest-context.xml]
07:09:18,381 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/batch-common.xml]
07:09:18,400 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/hadoop-ctx.xml]
07:09:18,432 INFO Test worker support.DefaultListableBeanFactory - Overriding bean definition for bean 'mainJob': replacing [Generic bean: class [org.springframework.batch.core.configuration.xml.SimpleFlowFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Generic bean: class [org.springframework.batch.core.configuration.xml.JobParserJobFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null]
07:09:18,434 INFO Test worker support.GenericApplicationContext - Refreshing org.springframework.context.support.GenericApplicationContext@7fb5438d: startup date [Thu Sep 05 07:09:18 PDT 2013]; root of context hierarchy
07:09:18,457 INFO Test worker config.PropertyPlaceholderConfigurer - Loading properties file from class path resource [test.properties]
07:09:18,480 INFO Test worker support.DefaultListableBeanFactory - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@57003d5d: defining beans [jobRepository,transactionManager,jobLauncher,taskExecutor,ppc,hadoopFs,hadoopResourceLoader,hadoopConfiguration,cfg-init,fs-init,rl-init,org.springframework.data.hadoop.scripting.HdfsScriptRunner#0,org.springframework.batch.core.scope.internalStepScope,org.springframework.beans.factory.config.CustomEditorConfigurer,org.springframework.batch.core.configuration.xml.CoreNamespacePostProcessor,bean,ns,ns2,mainJob,tasklet,script,script-tasklet,org.springframework.data.hadoop.scripting.HdfsScriptRunner#1,nested-script-tasklet,org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor]; root of factory hierarchy
07:09:18,642 INFO Test worker support.SimpleJobLauncher - Job: [FlowJob: [name=mainJob]] launched with the following parameters: [{}]
07:09:18,661 INFO Test worker job.SimpleStepHandler - Executing step: [bean]
07:09:18,710 INFO Test worker job.SimpleStepHandler - Executing step: [ns]
hdfs://w1-kodiak-hd023:8020
Created file c12d7995-582a-4b81-850e-915f6676298d
File content is #
# Hadoop/Hive/Pig settings used for testing
#
# Note: system properties are checked first - that is, the command line takes precedence (useful for CI)
# Amazon EMR
# hive.port=10003
# hd.fs=s3n://work-emr/tmp
# hd.jt=localhost:20001
# hd.jt=10.80.205.79:9001
# Apache Whirr - EC2
# hd.fs=hdfs://xxx.amazonaws.com:8020
# hd.jt=xxx.amazonaws.com:8021
# Default - Vanilla Installs
#hd.fs=hdfs://${hadoop.host:localhost}:${hadoop.port:9000}
#
# Hadoop settings
#
## M&R
hadoop.fs=${hd.fs|hdfs://localhost:8020}
hadoop.jt=${hd.jt|local}
## Hive
hive.host=${hd.hive.host|localhost}
hive.port=${hd.hive.port|10000}
hive.url=jdbc:hive://${hive.host}:${hive.port}
## HBase
hbase.host=${hd.hbase.host|localhost}
hbase.port=${hd.hbase.port|2181}
## Pig Execution
## externalized since the CDH3 VM Pig install has issues (SLF4J CNFE)
pig.execution=${hd.pig.exec|LOCAL}
#
# Other settings
#
#path.cat=bin${file.separator}stream-bin${file.separator}cat
#path.wc=bin${file.separator}stream-bin${file.separator}wc
path.cat=cat
path.wc=wc
input.directory=logs
log.input=/logs/input/
log.output=/logs/output/
distcp.src=${hadoop.fs}/distcp/source.txt
distcp.dst=${hadoop.fs}/distcp/dst
cascade.sec=/test/cascading/loganalysis/sec
cascade.min=/test/cascading/loganalysis/min
drwx------ - bamboo supergroup 0 2013-09-05 07:09 /user/bamboo/script-dir
-rwxr-xr-x 3 bamboo supergroup 1271 2013-09-05 07:09 /user/bamboo/script-dir/c12d7995-582a-4b81-850e-915f6676298d
07:09:18,855 INFO Test worker job.SimpleStepHandler - Executing step: [ns2]
drwxrwxrwx - bamboo supergroup 0 2013-08-19 12:16 /
drwxrwxrwx - bamboo supergroup 0 2013-08-19 11:25 /app
drwxr-xr-x - root supergroup 0 2013-09-05 06:03 /batch
drwxrwxrwx - root supergroup 0 2013-09-05 06:03 /batch-param-test
drwxrwxrwx - root supergroup 0 2013-09-05 06:02 /ide-test
drwxrwxrwx - bamboo supergroup 0 2013-08-19 11:24 /mapred
drwxrwxrwx - root supergroup 0 2013-09-05 07:09 /test
drwxrwxrwx - bamboo supergroup 0 2013-09-05 06:05 /tmp
drwxrwxrwx - bamboo supergroup 0 2013-08-19 11:49 /user
07:09:18,902 INFO Test worker support.SimpleJobLauncher - Job: [FlowJob: [name=mainJob]] completed with the following parameters: [{}] and the following status: [COMPLETED]