07:09:18,909 INFO Test worker context.TestContextManager - @TestExecutionListeners is not present for class [class org.springframework.data.hadoop.scripting.ScriptingTest]: using defaults.
07:09:18,913 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/scripting/ScriptingTest-context.xml]
07:09:18,944 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/hadoop-ctx.xml]
07:09:18,979 INFO Test worker support.GenericApplicationContext - Refreshing org.springframework.context.support.GenericApplicationContext@1b4a8134: startup date [Thu Sep 05 07:09:18 PDT 2013]; root of context hierarchy
07:09:18,994 INFO Test worker config.PropertyPlaceholderConfigurer - Loading properties file from class path resource [test.properties]
07:09:19,000 INFO Test worker support.DefaultListableBeanFactory - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@77b8378f: defining beans [ppc,hadoopFs,hadoopResourceLoader,hadoopConfiguration,cfg-init,fs-init,rl-init,org.springframework.data.hadoop.scripting.HdfsScriptRunner#0,script-js,script-py,script-rb,script-groovy,distcp,inlined-js,org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor]; root of factory hierarchy
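For orientation, here is a minimal sketch of the test class this log refers to, assuming a standard Spring TestContext setup (the class name and context file name are taken from the log lines above; the body is hypothetical, not the actual ScriptingTest source):

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.test.context.ContextConfiguration;
    import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

    // Spring's TestContext framework loads ScriptingTest-context.xml (named in the
    // log) relative to this class's package; with no @TestExecutionListeners
    // declared, the defaults mentioned in the first log line apply.
    @RunWith(SpringJUnit4ClassRunner.class)
    @ContextConfiguration("ScriptingTest-context.xml")
    public class ScriptingTest {

        @Test
        public void contextLoads() {
            // the script beans (script-js, script-py, script-rb, script-groovy)
            // defined in the context run at startup or from test methods
        }
    }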
Home dir is hdfs://w1-kodiak-hd023:8020/user/bamboo
Work dir is hdfs://w1-kodiak-hd023:8020/user/bamboo
/user exists true
File content is #
# Hadoop/Hive/Pig settings used for testing
#
# Note: the system properties are checked first - that is, the command line takes precedence (useful for CI)
# Amazon EMR
# hive.port=10003
# hd.fs=s3n://work-emr/tmp
# hd.jt=localhost:20001
# hd.jt=10.80.205.79:9001
# Apache Whirr - EC2
# hd.fs=hdfs://xxx.amazonaws.com:8020
# hd.jt=xxx.amazonaws.com:8021
# Default - Vanilla Installs
#hd.fs=hdfs://${hadoop.host:localhost}:${hadoop.port:9000}
#
# Hadoop settings
#
## M&R
hadoop.fs=${hd.fs|hdfs://localhost:8020}
hadoop.jt=${hd.jt|local}
## Hive
hive.host=${hd.hive.host|localhost}
hive.port=${hd.hive.port|10000}
hive.url=jdbc:hive://${hive.host}:${hive.port}
## HBase
hbase.host=${hd.hbase.host|localhost}
hbase.port=${hd.hbase.port|2181}
## Pig Execution
## externalized since the CDH3 VM Pig install has issues (SLF4J ClassNotFoundException)
pig.execution=${hd.pig.exec|LOCAL}
#
# Other settings
#
#path.cat=bin${file.separator}stream-bin${file.separator}cat
#path.wc=bin${file.separator}stream-bin${file.separator}wc
path.cat=cat
path.wc=wc
input.directory=logs
log.input=/logs/input/
log.output=/logs/output/
distcp.src=${hadoop.fs}/distcp/source.txt
distcp.dst=${hadoop.fs}/distcp/dst
cascade.sec=/test/cascading/loganalysis/sec
cascade.min=/test/cascading/loganalysis/min
drwx------ - bamboo supergroup 0 2013-09-05 07:09 /user/bamboo/script-dir
-rwxr-xr-x 3 bamboo supergroup 1271 2013-09-05 07:09 /user/bamboo/script-dir/d894eec2-ca0d-420d-b1ef-596a337d76d7
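The test.properties content echoed above uses '|' rather than Spring's default ':' as the placeholder-to-default separator (e.g. ${hd.fs|hdfs://localhost:8020}), and its header comment says system properties win. A minimal sketch of a placeholder configurer wired that way, presumably what the 'ppc' bean in the factory listing does (the Java assembly below is an illustration, not the test's actual XML):

    import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
    import org.springframework.core.io.ClassPathResource;

    public class TestPropertiesConfig {

        // Settings inferred from the ${key|default} syntax in test.properties;
        // not the project's actual wiring.
        public static PropertyPlaceholderConfigurer ppc() {
            PropertyPlaceholderConfigurer ppc = new PropertyPlaceholderConfigurer();
            ppc.setLocation(new ClassPathResource("test.properties"));
            // '|' separates a placeholder key from its fallback value, so
            // ${hd.fs|hdfs://localhost:8020} resolves to the default when hd.fs is unset
            ppc.setValueSeparator("|");
            // system properties (command-line -D flags) override file entries,
            // matching the file's header note about CI
            ppc.setSystemPropertiesMode(PropertyPlaceholderConfigurer.SYSTEM_PROPERTIES_MODE_OVERRIDE);
            return ppc;
        }
    }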
07:09:19,208 WARN Test worker scripting.HdfsScriptRunner - No Hadoop Configuration detected; not binding Configuration as variable 'cfg' to script
07:09:19,208 WARN Test worker scripting.HdfsScriptRunner - No Hadoop Configuration or ResourceLoader detected; not binding variable 'hdfsRL' to script
07:09:19,208 WARN Test worker scripting.HdfsScriptRunner - No Hadoop Configuration or FileSystem detected; not binding variable 'fs' to script
07:09:19,208 WARN Test worker scripting.HdfsScriptRunner - No Hadoop Configuration detected; not binding DistCp as variable 'distcp' to script
07:09:19,209 WARN Test worker scripting.HdfsScriptRunner - No Hadoop Configuration detected; not binding FsShell as variable 'fsh' to script
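The five warnings above share one cause: the runner found no Hadoop Configuration, so none of the convenience variables (cfg, hdfsRL, fs, distcp, fsh) are bound into the script. A rough sketch of the objects that would be bound once a Configuration is present (equivalent plumbing for illustration, not HdfsScriptRunner's internals):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.springframework.data.hadoop.fs.FsShell;

    public class ScriptBindings {
        public static void main(String[] args) throws Exception {
            // 'cfg': picks up core-site.xml from the classpath
            Configuration cfg = new Configuration();
            // 'fs': the FileSystem the script's ls/copy calls would go through
            FileSystem fs = FileSystem.get(cfg);
            // 'fsh': Spring Hadoop's shell-like helper over that file system;
            // 'hdfsRL' and 'distcp' would be built from the same cfg
            FsShell fsh = new FsShell(cfg);
            // mirrors the "Home dir is ..." line in the script output above
            System.out.println("Home dir is " + fs.getHomeDirectory());
        }
    }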