Exploring the sample plugins for PDI

Dear Kettle enthusiasts,

this is just a short blog post to point out that some new development resources are available: Pentaho has added several sample plugins for PDI to the public CI server. If you have ever wondered how to create your own plugins for PDI, check out the samples!

The following artifacts are available:

Step plugin http://ci.pentaho.com/job/kettle-sdk-step-plugin/
Job entry plugin http://ci.pentaho.com/job/kettle-sdk-jobentry-plugin/
Database plugin http://ci.pentaho.com/job/kettle-sdk-database-plugin/

The .zip artifacts contain an Eclipse-importable project that you can use to kickstart your own plugin development efforts.
I’ve put together a short screencast that shows how to import the job entry plugin into Eclipse, deploy it, and debug its code in Eclipse.
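If you just want a feel for what a step plugin looks like before importing the samples, the heart of it is a class that extends BaseStep, whose processRow() method gets called once per row. The sketch below is illustrative only: the package and class names are made up, while the base classes and methods come from the Kettle step API.

```java
package com.example.pdi.demo; // hypothetical package, not from the samples

import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.*;

public class DemoStep extends BaseStep implements StepInterface {

  public DemoStep( StepMeta stepMeta, StepDataInterface stepDataInterface,
      int copyNr, TransMeta transMeta, Trans trans ) {
    super( stepMeta, stepDataInterface, copyNr, transMeta, trans );
  }

  public boolean processRow( StepMetaInterface smi, StepDataInterface sdi )
      throws KettleException {
    Object[] row = getRow();          // fetch the next input row
    if ( row == null ) {              // no more input: signal we are done
      setOutputDone();
      return false;                   // false = stop calling processRow()
    }
    // ... transform the row here ...
    putRow( getInputRowMeta(), row ); // hand the row to the next step
    return true;                      // true = call me again for the next row
  }
}
```

The sample projects additionally provide the matching StepMetaInterface, StepDataInterface, and dialog classes that a complete plugin needs, which is exactly what makes them a convenient starting point.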

Be sure to watch in HD and enable annotations!

To help you understand the sample code, you may want to check out the plugin development tutorials:

Developing a custom Kettle Plugin: A Simple Transformation Step
Developing a custom Kettle Plugin: Looking up values in Voldemort
Developing a Custom Kettle Job Entry Plugin: Triggering a Report on JasperServer

Have fun developing your own plugins!

Cheers
Slawo

2 comments to Exploring the sample plugins for PDI

  • Hi,

I am working with Pentaho Data Integration (Community) and I have a question.
I am creating a transformation and I create a database connection to a MySQL database. The connection works well, but when I create a second transformation and use a database step, I can't select the database connection that I created for the first transformation.
It seems like a database connection is tied to a single transformation, but how can I create a database connection available to all objects of my session?
For example, I can select AgileBI (the connection of the installation) from every transformation or job.

    Thanks

  • Hi Slawo,

I am trying to read a file from HDFS in Hadoop from Pentaho DI and I am running into problems:

- Pentaho DI (open source) version 6.1 on a local Windows 7 machine
    - HDFS in a Virtual Machine Cloudera Quick Start 5.4

I have indicated my Hadoop distribution (cdh55) in Tools, and I have created a cluster in 'View'; when I test the cluster it works properly.

Then I create a Hadoop File Input step. I can browse the cluster and select a file, and I specified the attributes (Unix format, delimiter, ...), but I have problems when trying to read it. I get the following error:

    ……
    at org.pentaho.commons.launcher.Launcher.main (Launcher.java:92)
    at java.lang.reflect.Method.invoke (null:-1)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (null:-1)
    at sun.reflect.NativeMethodAccessorImpl.invoke (null:-1)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (null:-2)
    at org.pentaho.di.ui.spoon.Spoon.main (Spoon.java:662)
    at org.pentaho.di.ui.spoon.Spoon.start (Spoon.java:9269)
    at org.pentaho.di.ui.spoon.Spoon.waitForDispose (Spoon.java:7989)
    at org.pentaho.di.ui.spoon.Spoon.readAndDispatch (Spoon.java:1347)
    at org.eclipse.swt.widgets.Display.readAndDispatch (null:-1)
    at org.eclipse.swt.widgets.Display.runDeferredEvents (null:-1)
    at org.eclipse.swt.widgets.Widget.sendEvent (null:-1)
    at org.eclipse.swt.widgets.EventTable.sendEvent (null:-1)
    at org.eclipse.jface.action.ActionContributionItem$5.handleEvent (ActionContributionItem.java:402)
    at org.eclipse.jface.action.ActionContributionItem.access$2 (ActionContributionItem.java:490)
    at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection (ActionContributionItem.java:545)
    at org.eclipse.jface.action.Action.runWithEvent (Action.java:498)
    at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run (JfaceMenuitem.java:106)
    at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100 (JfaceMenuitem.java:43)
    at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke (AbstractXulComponent.java:141)
    at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke (AbstractXulComponent.java:157)
    at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke (AbstractXulDomContainer.java:313)
    at java.lang.reflect.Method.invoke (null:-1)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (null:-1)
    at sun.reflect.NativeMethodAccessorImpl.invoke (null:-1)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (null:-2)
    at org.pentaho.di.ui.spoon.trans.TransGraph.editStep (TransGraph.java:2129)
    at org.pentaho.di.ui.spoon.trans.TransGraph.editStep (TransGraph.java:3072)
    at org.pentaho.di.ui.spoon.Spoon.editStep (Spoon.java:8783)
    at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep (SpoonStepsDelegate.java:125)
    at org.pentaho.big.data.kettle.plugins.hdfs.trans.HadoopFileInputDialog.open (HadoopFileInputDialog.java:575)
    at org.eclipse.swt.widgets.Display.readAndDispatch (null:-1)
    at org.eclipse.swt.widgets.Display.runDeferredEvents (null:-1)
    at org.eclipse.swt.widgets.Widget.sendEvent (null:-1)
    at org.eclipse.swt.widgets.EventTable.sendEvent (null:-1)
    at org.pentaho.big.data.kettle.plugins.hdfs.trans.HadoopFileInputDialog$3.handleEvent (HadoopFileInputDialog.java:482)
    at org.pentaho.big.data.kettle.plugins.hdfs.trans.HadoopFileInputDialog.access$200 (HadoopFileInputDialog.java:125)
    at org.pentaho.big.data.kettle.plugins.hdfs.trans.HadoopFileInputDialog.first (HadoopFileInputDialog.java:2634)
    at org.pentaho.big.data.kettle.plugins.hdfs.trans.HadoopFileInputDialog.getFirst (HadoopFileInputDialog.java:2722)
    at org.pentaho.di.trans.steps.textfileinput.TextFileInput.getLine (TextFileInput.java:97)
    at org.pentaho.di.trans.steps.textfileinput.TextFileInput.getLine (TextFileInput.java:127)
    at java.io.InputStreamReader.read (null:-1)
    at sun.nio.cs.StreamDecoder.read (null:-1)
    at sun.nio.cs.StreamDecoder.read0 (null:-1)
    at sun.nio.cs.StreamDecoder.read (null:-1)
    at sun.nio.cs.StreamDecoder.implRead (null:-1)
    at sun.nio.cs.StreamDecoder.readBytes (null:-1)
    at org.pentaho.di.core.compress.CompressionInputStream.read (CompressionInputStream.java:68)
    at org.apache.commons.vfs2.util.MonitorInputStream.read (MonitorInputStream.java:99)
    at java.io.BufferedInputStream.read (null:-1)
    at java.io.BufferedInputStream.read1 (null:-1)
    at java.io.DataInputStream.read (null:-1)
    at org.apache.hadoop.hdfs.DFSInputStream.read (DFSInputStream.java:903)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy (DFSInputStream.java:851)
    at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo (DFSInputStream.java:624)
    at org.apache.hadoop.hdfs.BlockReaderFactory.build (BlockReaderFactory.java:374)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp (BlockReaderFactory.java:753)
    at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer (BlockReaderFactory.java:838)
    at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer (DFSClient.java:3492)
    at org.apache.hadoop.net.NetUtils.connect (NetUtils.java:530)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect (SocketIOWithTimeout.java:192)
    at sun.nio.ch.SocketChannelImpl.connect (null:-1)
    at sun.nio.ch.Net.checkAddress (null:-1)


    …..

I am not sure if the problem is in Pentaho or HDFS. I have checked that the HDFS service is up and running, as well as ZooKeeper and the other Hadoop services. I have created the connection with the Cloudera user.

Can you help me? Any advice will be greatly appreciated.

    Thanks in advance,

    Juan
