Tag Archives: GIT

java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskInputOutputContext, but class was expected

While writing the unit test cases for my application, I was getting the exception mentioned below during the execution of these test cases. Here is the complete exception trace:

java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskInputOutputContext, but class was expected
	at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper.createCommon(AbstractMockContextWrapper.java:59)
	at org.apache.hadoop.mrunit.internal.mapreduce.MockMapContextWrapper.create(MockMapContextWrapper.java:77)
	at org.apache.hadoop.mrunit.internal.mapreduce.MockMapContextWrapper.(MockMapContextWrapper.java:68)
	at org.apache.hadoop.mrunit.mapreduce.MapDriver.getContextWrapper(MapDriver.java:167)
	at org.apache.hadoop.mrunit.mapreduce.MapDriver.getContext(MapDriver.java:198)
	at com.techidiocy.integratekeys.mapreduce.test.TestIntegrationKeysMapper.init(TestIntegrationKeysMapper.java:37)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:263)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:69)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:48)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:231)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:60)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:50)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:222)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:292)
	at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)

Continue reading java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskInputOutputContext, but class was expected


org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO “TABLE_PARAMS” – Hive with Kite Morphlines


If you have read my previous post (Anatomy of a configuration file), you already know the flow of the application on which I was working. In the last command of my morphline I was creating the Hive table using the Avro schema which was created in the previous command. As per the design of my application, I decided that I would store my Avro schema file on the local file system instead of saving it on HDFS. Everything was working fine as expected, but things started breaking when my Avro schema file contained more than 4000 characters. Below is the complete exception which I got during Hive table creation:

org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO “TABLE_PARAMS” 
Continue reading org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO “TABLE_PARAMS” – Hive with Kite Morphlines


java.io.IOException: can not read class parquet.format.PageHeader: null – Hive

While evaluating Cloudera Kite Morphlines, I came across this exception while reading data from the table:

java.io.IOException: can not read class parquet.format.PageHeader: null

Before going ahead, let me give you the background of what I am trying to do here.
I am building an application where an external client will upload input XML files and their corresponding XSDs. Once these files are uploaded, a job will run that unmarshals these XML files into Java objects; later, these Java objects will be passed to the Drools framework, where validation and minor transformations will be performed on the data. During this Continue reading java.io.IOException: can not read class parquet.format.PageHeader: null – Hive


Anatomy of a Command Builder with Example – Cloudera Kite Morphlines


In the last post we have seen the internals of a configuration file, also known as a morphline. In this post we are going to explore the actual code that does all the work in the background. It doesn't make a difference whether you are using a built-in command (bundled with the Cloudera Kite Morphlines SDK) or writing your own custom command; the basic structure and semantics of all the commands are the same.

All the commands in Cloudera Kite Morphlines implement the

org.kitesdk.morphline.api.CommandBuilder

interface. This interface contains 2 methods for which you have to provide implementations in your CommandBuilder, as sketched below. Continue reading Anatomy of a Command Builder with Example – Cloudera Kite Morphlines
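For orientation, here is a minimal sketch of what such a builder can look like. The class name, the command name "toUpperCase" and the "field" parameter are illustrative assumptions, not taken from the post; only the CommandBuilder, AbstractCommand and Record types come from the Kite SDK.

package com.example.morphlines;

import java.util.Collection;
import java.util.Collections;

import org.kitesdk.morphline.api.Command;
import org.kitesdk.morphline.api.CommandBuilder;
import org.kitesdk.morphline.api.MorphlineContext;
import org.kitesdk.morphline.api.Record;
import org.kitesdk.morphline.base.AbstractCommand;

import com.typesafe.config.Config;

// Hypothetical command that upper-cases the value of a configurable record field.
public final class ToUpperCaseBuilder implements CommandBuilder {

  // getNames() returns the name(s) under which the command is referenced in a morphline file.
  @Override
  public Collection<String> getNames() {
    return Collections.singletonList("toUpperCase");
  }

  // build() wires the command into the chain: config carries its parameters,
  // parent/child are its neighbours, and context holds shared services.
  @Override
  public Command build(Config config, Command parent, Command child, MorphlineContext context) {
    return new ToUpperCase(this, config, parent, child, context);
  }

  private static final class ToUpperCase extends AbstractCommand {
    private final String field;

    ToUpperCase(CommandBuilder builder, Config config, Command parent, Command child,
                MorphlineContext context) {
      super(builder, config, parent, child, context);
      this.field = getConfigs().getString(config, "field"); // required "field" parameter
      validateArguments();
    }

    @Override
    protected boolean doProcess(Record record) {
      Object value = record.getFirstValue(field);
      if (value != null) {
        record.replaceValues(field, value.toString().toUpperCase());
      }
      return super.doProcess(record); // hand the record to the next command in the chain
    }
  }
}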


Single Field Indexes MongoDB Java Example


Indexing in MongoDB works in much the same way as it does in relational databases. It helps in the fast retrieval of documents available in MongoDB. Having an index over a collection ensures that the least number of documents is scanned to find the documents matching the search criteria. Hence, having an understanding of indexes is very important for the efficient performance of your application. In this post we will cover: Continue reading Single Field Indexes MongoDB Java Example
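As a rough sketch, creating a single-field index with the legacy MongoDB Java driver looks like this; the host, database, collection and field names below are illustrative, not taken from the post.

import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;

public class SingleFieldIndexExample {
  public static void main(String[] args) throws Exception {
    MongoClient client = new MongoClient("localhost", 27017);
    DB db = client.getDB("test");
    DBCollection users = db.getCollection("users");

    // Create an ascending index on the "name" field (1 = ascending, -1 = descending).
    users.createIndex(new BasicDBObject("name", 1));

    // List the indexes on the collection to confirm it was created.
    System.out.println(users.getIndexInfo());

    client.close();
  }
}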


How to add existing projects to Github ?

Version control is an important aspect of any application development. It allows you to access your code base from different machines, allows you to make a lot of mistakes and then correct them before delivering a final product, allows you to track the progress of your project, and much more. If you are reading this post, I am assuming that you have a project that you want to move to Github. Continue reading How to add existing projects to Github ?
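As a quick preview, pushing an existing local project to a new GitHub repository typically boils down to a sequence like the following; the remote URL is a placeholder, not from the post.

cd my-project
git init
git add .
git commit -m "Initial commit"
git remote add origin https://github.com/YOUR_USERNAME/my-project.git
git push -u origin master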


Write Concern MongoDB Performance Comparison

In the last post we have seen the difference among all the available write concerns in MongoDB. We have also seen which write concern to choose in which scenario, which mostly depends on the type of data you are dealing with.

In this post we will see a performance comparison of all the write concerns. In this test I have inserted 100K records for each type of write concern and captured the time taken for them.

Below are the stats that I got after executing my test case. Continue reading Write Concern MongoDB Performance Comparison
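In essence, the test was a loop like the sketch below: insert N documents under a given write concern and measure the elapsed time. The collection name, document shape and the exact set of write concerns shown are illustrative assumptions, not the original test code; the actual stats are in the full post.

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;
import com.mongodb.WriteConcern;

public class WriteConcernBenchmark {

  // Inserts n documents with the given write concern and returns the elapsed time in ms.
  static long timeInserts(DBCollection col, WriteConcern wc, int n) {
    long start = System.currentTimeMillis();
    for (int i = 0; i < n; i++) {
      col.insert(new BasicDBObject("seq", i), wc);
    }
    return System.currentTimeMillis() - start;
  }

  public static void main(String[] args) throws Exception {
    MongoClient client = new MongoClient("localhost", 27017);
    DBCollection col = client.getDB("test").getCollection("writeConcernTest");

    WriteConcern[] concerns = {
        WriteConcern.UNACKNOWLEDGED, WriteConcern.ACKNOWLEDGED, WriteConcern.JOURNALED
    };
    for (WriteConcern wc : concerns) {
      col.drop(); // start each run from an empty collection
      System.out.println(wc + " -> " + timeInserts(col, wc, 100000) + " ms");
    }
    client.close();
  }
}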


Update Multiple Documents MongoDB Java Example

This post will talk about Update Multiple Documents MongoDB Java Example.
In the last few examples we have seen that we are using the default version of the update() method, which takes 2 arguments, i.e. a search criteria object and a modified object. Whenever this version of the update() method is executed it updates only Continue reading Update Multiple Documents MongoDB Java Example
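As a hedged sketch of the difference with the legacy Java driver (collection and field names are illustrative): the two-argument update() touches only the first matching document, while updateMulti() touches every match.

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;

public class UpdateMultipleDocumentsExample {
  public static void main(String[] args) throws Exception {
    MongoClient client = new MongoClient("localhost", 27017);
    DBCollection users = client.getDB("test").getCollection("users");

    BasicDBObject query = new BasicDBObject("status", "inactive");
    BasicDBObject change = new BasicDBObject("$set", new BasicDBObject("archived", true));

    // Default form: only the first document matching the query is updated.
    users.update(query, change);

    // Multi-update: every document matching the query is updated.
    users.updateMulti(query, change);

    client.close();
  }
}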


Bare and Non Bare Repositories , .git directory and working tree

These days I am busy learning the GIT distributed version control system, as our project is going to sit on it, so I am reading a lot and trying to understand it so that I can take full advantage of it in my project. In my previous post we have seen the working of the git clone command and how it is different from svn checkout.
In this post we are going to see some differences between bare and non-bare repositories, and which one to use when.
A normal GIT repository is basically made up of 2 components:

1. Working directory: In simple terms, it is the place where all checked-out files reside, and a file available in this location can be added to the git repository by

git add FILE_NAME

2. .git directory: In simple terms, this directory contains all the administrative files and data that GIT requires to support commands like git commit, git status, git pull and git push. This directory is always present in the root of your project directory.

So, let's start looking into bare and non-bare repositories. Continue reading Bare and Non Bare Repositories , .git directory and working tree
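As a quick illustration of the two flavours (directory names are placeholders), a plain git init gives you a working tree plus a .git directory, whereas git init --bare creates only the repository data with no working tree:

git init my-project          # non-bare: my-project/ working tree + my-project/.git
git init --bare my-project.git   # bare: repository data only, nothing to edit directly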


Understand git clone command, difference between svn checkout and git clone

Recently our project migrated from SVN to GIT distributed version control, and later I came to know that it has a lot of advantages over SVN. So I started looking into basic GIT terminology like clone, commit, pull, push, fetch, rebase, merge and branches. Honestly speaking, I was very confused at first, but as soon as I started going into the depth I found it interesting, and something that I had never worked with till now. I read a lot of books and a lot of blogs on basic GIT terminology, and here I am going to write my findings on the same, so that it can help newbies like me and, most importantly, serve as a future reference for myself.

So the whole story begins with how to get the code into your local system. Continue reading Understand git clone command, difference between svn checkout and git clone
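On the surface the two commands look similar (the URLs below are placeholders), but svn checkout fetches a working copy of a single revision from the central server, while git clone copies the entire repository, history included, onto your machine:

svn checkout https://svn.example.com/repos/my-project/trunk my-project
git clone https://github.com/YOUR_USERNAME/my-project.git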
