07:12:55,802 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/poc/context.xml]
07:12:55,837 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/batch-common.xml]
07:12:55,850 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/hadoop-ctx.xml]
07:12:55,869 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/poc/int-trigger.xml]
07:12:55,981 INFO Test worker support.DefaultListableBeanFactory - Overriding bean definition for bean 'batch': replacing [Generic bean: class [org.springframework.batch.core.configuration.xml.SimpleFlowFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null] with [Generic bean: class [org.springframework.batch.core.configuration.xml.JobParserJobFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null]
07:12:55,983 INFO Test worker support.GenericXmlApplicationContext - Refreshing org.springframework.context.support.GenericXmlApplicationContext@7f405c3: startup date [Thu Sep 05 07:12:55 PDT 2013]; root of context hierarchy
07:12:55,998 INFO Test worker config.PropertyPlaceholderConfigurer - Loading properties file from class path resource [test.properties]
07:12:56,005 INFO Test worker xml.DefaultConfiguringBeanFactoryPostProcessor - No bean named 'errorChannel' has been explicitly defined. Therefore, a default PublishSubscribeChannel will be created.
07:12:56,005 INFO Test worker xml.DefaultConfiguringBeanFactoryPostProcessor - No bean named 'taskScheduler' has been explicitly defined. Therefore, a default ThreadPoolTaskScheduler will be created.
07:12:56,018 INFO Test worker support.DefaultListableBeanFactory - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1fb179dc: defining beans [jobRepository,transactionManager,jobLauncher,taskExecutor,ppc,hadoopFs,hadoopResourceLoader,hadoopConfiguration,cfg-init,fs-init,rl-init,org.springframework.data.hadoop.scripting.HdfsScriptRunner#0,channelInitializer,$autoCreateChannelCandidates,org.springframework.integration.internalDefaultConfiguringBeanFactoryPostProcessor,org.springframework.integration.file.config.FileListFilterFactoryBean#0,org.springframework.integration.file.config.FileReadingMessageSourceFactoryBean#0,org.springframework.scheduling.support.PeriodicTrigger#0,input-logs,in,org.springframework.integration.config.ServiceActivatorFactoryBean#0,org.springframework.integration.config.ConsumerEndpointFactoryBean#0,done,org.springframework.integration.handler.LoggingHandler#0,done.adapter,org.springframework.scheduling.support.PeriodicTrigger#1,hdfs-trigger,logger,org.springframework.integration.handler.LoggingHandler#1,logger.adapter,org.springframework.batch.core.scope.internalStepScope,org.springframework.beans.factory.config.CustomEditorConfigurer,org.springframework.batch.core.configuration.xml.CoreNamespacePostProcessor,execute-mr,execute-streaming,execute-pig,batch,hadoop-mr,hadoop-streaming,mr-job,stream-job,pigFactory,pig-script,file-reader,nullChannel,errorChannel,_org.springframework.integration.errorLogger,taskScheduler,org.springframework.integration.config.IdGeneratorConfigurer#0]; root of factory hierarchy
packageJobJar: [] [/home/bamboo/.gradle/caches/artifacts-15/filestore/org.apache.hadoop/hadoop-streaming/1.2.0/jar/37d96e82162684e35f9c7ead641cc8226bab121c/hadoop-streaming-1.2.0.jar] /tmp/streamjob7585686765789618725.jar tmpDir=null
07:12:56,301 INFO Test worker concurrent.ThreadPoolTaskScheduler - Initializing ExecutorService 'taskScheduler'
07:12:56,307 INFO Test worker support.DefaultLifecycleProcessor - Starting beans in phase -2147483648
07:12:56,307 INFO Test worker endpoint.EventDrivenConsumer - Adding {service-activator} as a subscriber to the 'in' channel
07:12:56,308 INFO Test worker channel.DirectChannel - Channel 'in' has 1 subscriber(s).
07:12:56,308 INFO Test worker endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#0
07:12:56,309 INFO Test worker endpoint.EventDrivenConsumer - Adding {logging-channel-adapter:done.adapter} as a subscriber to the 'done' channel
07:12:56,309 INFO Test worker channel.DirectChannel - Channel 'done' has 1 subscriber(s).
07:12:56,309 INFO Test worker endpoint.EventDrivenConsumer - started done.adapter
07:12:56,309 INFO Test worker endpoint.EventDrivenConsumer - Adding {logging-channel-adapter:logger.adapter} as a subscriber to the 'logger' channel
07:12:56,310 INFO Test worker channel.DirectChannel - Channel 'logger' has 1 subscriber(s).
07:12:56,310 INFO Test worker endpoint.EventDrivenConsumer - started logger.adapter
07:12:56,310 INFO Test worker endpoint.EventDrivenConsumer - Adding {logging-channel-adapter:_org.springframework.integration.errorLogger} as a subscriber to the 'errorChannel' channel
07:12:56,311 INFO Test worker channel.PublishSubscribeChannel - Channel 'errorChannel' has 1 subscriber(s).
07:12:56,311 INFO Test worker endpoint.EventDrivenConsumer - started _org.springframework.integration.errorLogger
07:12:56,312 INFO Test worker support.DefaultLifecycleProcessor - Starting beans in phase 2147483647
07:12:56,315 INFO Test worker endpoint.SourcePollingChannelAdapter - started input-logs
07:12:56,315 INFO Test worker endpoint.SourcePollingChannelAdapter - started hdfs-trigger
07:12:56,476 WARN task-scheduler-2 handler.LoggingHandler - [HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/00053437-d3f1-41e9-8f34-cbbca713c629], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/021a76d5-ba0e-4da7-b4da-ea15db3669ff], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/022bda9f-2a20-41ea-9d53-65789ff27b98], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/05647012-4841-4742-813f-a6a15ba405e0], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/05d84974-7561-4ab8-b1b6-6ff1a4b8c99a], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/0c4e3d7f-a1bd-4941-80c3-d7decc56bb61], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/162cd27c-7456-4978-be6e-4a0bddfd2a4d], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/249e526b-4f87-4ca0-a22c-d7c6fcf0ddee], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/2768dffc-261c-4376-bae1-a00889fe4c8a], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/2d7fb48b-ace4-47b8-b4a2-681a53a11fa0], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/3a3d3eae-7fca-4759-937c-a6af00dd11a0], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/3f943ab7-9974-4e51-8a68-18de08bc6d46], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/4535748f-b620-43b9-b730-f27fdf831d6f], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/4d821531-c56c-44f9-b20f-dca14e8db3a6], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/529331cc-fa9f-4cff-a5d4-e8926639ebc2], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/53a9ba56-87e5-47b3-826f-3b0c17288ff8], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/55da2b16-9fda-4255-928b-fafcab271a10], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/57257ff2-3d02-43c7-ab94-04be1b003507], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/6213e7df-7578-4668-b507-27468a76a94b], HDFS Resource for 
[hdfs://w1-kodiak-hd023:8020/user/bamboo/645aa5f7-2226-478a-8d7f-cde78cb4e5b4], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/656ecf53-fee6-431a-ab14-bad8ad9311f8], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/69daceeb-629c-46f5-8e00-134b189ca1a3], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/701fd5a2-641c-4273-805b-c68af6a1e1bc], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/7b871e80-7fb2-4898-8916-df30bd97dc34], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/7c3624df-66d1-4d8a-a4bd-3899189864a8], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/8093a7ff-c19c-4b6d-adc8-cbf324ef3871], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/85c1c466-3274-4241-82f0-c77cb0e5ff4c], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/88d75d4c-541f-4753-9fc0-e6c86d146738], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/8d4d30eb-b584-4373-a99b-a87405e6b0bb], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/92735489-81f0-42ec-9d43-4e86ec7d6263], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/93d7c94f-be75-4bc2-af03-6fa9bb2c93eb], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/971dae2c-12d8-40ae-a559-a0066e7d5694], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/9787c240-b199-462d-b670-ab9fcc4bd528], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/9b0f16de-6882-424d-91ab-414d8ba646ad], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/9f8015e3-098e-403b-8011-34e477954c68], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/a6c64763-f283-4088-a692-8963fcd61782], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/adf4fc51-8f20-40a2-bf7d-77276a5d0a38], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/b05ccc06-042e-4bde-95bd-574d66728d13], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/bede805f-54b0-4a8a-a617-8598c6d511b9], HDFS Resource for 
[hdfs://w1-kodiak-hd023:8020/user/bamboo/c07fdeb7-7a18-4912-be1c-25df42a116d3], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/c0f2c0f0-6135-453f-b647-ec4a24bbbda5], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/c12d7995-582a-4b81-850e-915f6676298d], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/cef8ac9b-8b6b-42b4-96eb-1d22dfa1c6bf], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/d6345d51-1216-4a04-b1ac-7b386aed94b2], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/d6cee4f2-9ae2-4f08-ba70-219e4c8df21c], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/d6d0f8de-d8cc-4cd2-9ea3-76cb0b5bd54a], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/d894eec2-ca0d-420d-b1ef-596a337d76d7], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/e11fc1b5-21cf-4301-bfde-5c963fdf5d91], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/e51d816d-5d60-4e66-ab8f-28ff4e010958], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/ec5dbfc8-8f75-4fa2-bdfe-459ab2f387c1], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/efa0ae76-81a5-40b8-bc26-8f9bdd1e7232], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/f4c78a15-0672-460a-beb1-a985847e4729], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/output/_SUCCESS], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/output/_logs/history/job_201308191125_0564_1378390175016_bamboo_ns-stream-job_default_*], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/output/_logs/history/job_201308191125_0564_conf.xml], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/output/part-00000], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/passwd-test], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/src/test/resources/logs/apache_access.log], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test], HDFS Resource for 
[hdfs://w1-kodiak-hd023:8020/user/bamboo/test-01187c5c-305a-4a20-9491-bf0fe251194b.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-15f23d66-c583-4fe5-856e-8c3fe84800bb.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-212e4d7e-3e04-4ab4-9dc0-7216f5fa1b8b.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-30579292-fb4b-4cd8-beb3-910b51dba328.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-30f2c283-2e71-495b-98fe-e5d6181112bf.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-43947aaf-5c26-4df9-9f47-88b749ff08b6.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-4f3809d0-d34b-4420-8f3e-aa57a48f5153.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-521b2391-6bc4-4899-aad9-790aaabdd0b6.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-57a2b7a3-a819-4d2c-8c45-fcdad920e27e.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-591f2967-9349-4157-8fbe-a9e82e749ae5.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-5a41060a-c5cd-44be-ad0f-67fa2b102bfc.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-61a70cd0-3bb8-4d3f-ba9a-030bb09827f6.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-751820d0-faae-40d8-9385-45908c57dde4.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-86554873-33ba-470c-b9dd-1fe32724f455.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-98f78d72-7186-41bb-8815-897b5e218b38.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-99c68b43-e803-41a3-8a29-f8133fede1bd.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-9b264ab1-f97c-4449-96d4-90a7ac82f51a.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-a2f4acb7-e9b8-4435-a069-bd35d4fb5546.file], HDFS Resource for 
[hdfs://w1-kodiak-hd023:8020/user/bamboo/test-a9151fe3-2ddb-4e58-99cc-4acbd4599aaa.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-c9665f1c-1d17-4ac5-afca-3d08a81adf6b.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-d2e4d686-3690-4f50-8896-4ab80993bdc5.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-d747307d-8ee4-4aba-9bbc-a5f2b6a31571.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-dc201922-62c7-4479-9cd3-06d6143453e7.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-ddf0c7dd-42a4-4098-bee0-04d1d3f9f2d0.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-e31f95de-0796-4212-99f3-239dce9cc8e4.file], HDFS Resource for [hdfs://w1-kodiak-hd023:8020/user/bamboo/test-fdca0e57-b1b7-45d3-9bf8-138d1191535f.file]]
Called script
07:13:05,690 INFO myScheduler-2 mapreduce.JobRunner - Starting job [custom-jar-job]
07:13:05,732 INFO myScheduler-2 mapred.JobClient - Cleaning up the staging area hdfs://w1-kodiak-hd023:8020/app/hadoop/tmp/mapred/staging/bamboo/.staging/job_201308191125_0580
07:13:05,732 ERROR myScheduler-2 security.UserGroupInformation - PriviledgedActionException as:bamboo cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /ide-test/runner/output-4 already exists
07:13:05,732 WARN myScheduler-2 mapreduce.JobRunner - Cannot start job [custom-jar-job]
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /ide-test/runner/output-4 already exists
	at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:201)
	at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
	at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:172)
	at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:164)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:52)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:33)
	at org.springframework.data.hadoop.mapreduce.JobRunner.invoke(JobRunner.java:88)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:616)
	at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:273)
	at org.springframework.scheduling.support.MethodInvokingRunnable.run(MethodInvokingRunnable.java:65)
	at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:51)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:165)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:267)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:679)
07:13:05,739 ERROR myScheduler-2 support.MethodInvokingRunnable - Invocation of method 'call' on target class [class org.springframework.data.hadoop.mapreduce.JobRunner] failed
java.lang.IllegalStateException: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /ide-test/runner/output-4 already exists
	at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:213)
	at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
	at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:172)
	at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:164)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:52)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:33)
	at org.springframework.data.hadoop.mapreduce.JobRunner.invoke(JobRunner.java:88)
	at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:616)
	at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:273)
	at org.springframework.scheduling.support.MethodInvokingRunnable.run(MethodInvokingRunnable.java:65)
	at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:51)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:165)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:267)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:679)
Caused by: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /ide-test/runner/output-4 already exists
	at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
	at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:201)
	... 22 more