diff --git a/docs/cluster/oge.md b/docs/cluster/oge.md
index 420bc52359b5bc959a43c78179d845e5aebc5d0a..82397b58cbdaaea218d3055a86d4832fa2467b45 100644
--- a/docs/cluster/oge.md
+++ b/docs/cluster/oge.md
@@ -6,10 +6,10 @@ different users in a fair way. So Resources are shared equally between multiple
 
 # Sun Grid Engine
 
-Oracle Grid Engine or Sun Grid Engine is a computer cluster software sytem otherwise also known as a batch-queing system. These
+Oracle Grid Engine or Sun Grid Engine is a computer cluster software system, also known as a batch-queuing system. These
  systems help the computer cluster users to distribute and fairly schedule the jobs to the different computers in the cluster.
 
 # Open Grid Engine
 
-The Open Grid Engine (OGE) is based on the SunGridEngine but completely open source. It does support commercially batch-queuing
+The Open Grid Engine (OGE) is based on the Sun Grid Engine but is completely open source. It supports commercial batch-queuing
  systems.
\ No newline at end of file
diff --git a/docs/developer/example-pipeable.md b/docs/developer/example-pipeable.md
index b310008662e9c93451c37bc951c1101816824185..9604c8fb6f165fd2f5c027bf4aa794b9013493cb 100644
--- a/docs/developer/example-pipeable.md
+++ b/docs/developer/example-pipeable.md
@@ -3,7 +3,7 @@
 ## Introduction
 
 Since the release of Biopet v0.5.0 we support piping of programs/tools to decrease disk usage and run time. Here we make use of
- [fifo piping](http://www.gnu.org/software/libc/manual/html_node/FIFO-Special-Files.html#FIFO-Special-Files). Which enables a 
+ [fifo piping](http://www.gnu.org/software/libc/manual/html_node/FIFO-Special-Files.html#FIFO-Special-Files), which enables a 
  developer to very easily implement piping for most pipeable tools.
  
 ## Example
@@ -21,8 +21,8 @@ Since the release of Biopet v0.5.0 we support piping of programs/tools to decrea
 * In the above example we define the variable ***pipe***. This is the place to define which jobs should be piped together. In 
 this case
  we perform a zcat on the input files. After that GSNAP alignment and Picard reordersam is performed. The final output of this 
- job will be a SAM file all intermediate files will be removed as soon as the job finished completely without any error codes.
-* With the second command pipe.threadsCorrection = -1 we make sure the total number of assigned cores is not to high. This 
+ job will be a SAM file. All intermediate files will be removed as soon as the job has finished completely without any error codes.
+* With the second command ***pipe.threadsCorrection = -1*** we make sure the total number of assigned cores is not too high (see the sketch after this list). This 
 ensures that the job can still be scheduled to the compute cluster.
 * So we hope you can appreciate in the above example that we decrease the total number of assigned cores with 2. This is done 
 by the command ***zcatR1._1.foreach(x => pipe.threadsCorrection -= 1)***
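+
+Below is a toy sketch of the core accounting described in this list: each piped job brings its own
+thread count, and a negative ***threadsCorrection*** lowers the total that the pipe requests from the
+scheduler. The classes and job names here are made up for illustration only; this is not the Biopet API.
+
+```scala
+// Toy model only: shows how a threads correction keeps the requested core
+// count of a piped job realistic. Not Biopet code.
+case class Job(name: String, threads: Int)
+
+class FifoPipe(jobs: List[Job]) {
+  var threadsCorrection: Int = 0
+  // Cores requested from the cluster: the sum of all parts plus the correction.
+  def coreCount: Int = jobs.map(_.threads).sum + threadsCorrection
+}
+
+object PipeExample extends App {
+  val zcatJobs = List(Job("zcat", 1))
+  val pipe = new FifoPipe(zcatJobs ::: List(Job("gsnap", 8), Job("reordersam", 1)))
+
+  pipe.threadsCorrection = -1                        // lower the total by one
+  zcatJobs.foreach(_ => pipe.threadsCorrection -= 1) // and by one more per zcat job
+
+  println(s"Cores requested: ${pipe.coreCount}")     // 1 + 8 + 1 - 2 = 8
+}
+```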
diff --git a/docs/developer/example-tool.md b/docs/developer/example-tool.md
index 2c02efdbb6f01aa781234371e30ea40e1727dd8f..ce3886658f6609a3fd0e4e8e98f9cf2b23bafc04 100644
--- a/docs/developer/example-tool.md
+++ b/docs/developer/example-tool.md
@@ -28,14 +28,14 @@ object SimpleTool extends ToolCommand {
 ```
 
 This is the minimum setup for having a working tool. We will place some code for line counting in ``main``. Like in other 
-higher order programming languages like Java, C++, .Net. One need to specify an entry for the program to run. ``def main``
-is here the first entrypoint from commandline into your tool.
+higher-level programming languages such as Java, C++ and .NET, one needs to specify an entry point for the program to run. ``def main``
+is the first entry point from the command line into your tool.
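+
+As a minimal, generic illustration (plain Scala, not the Biopet ``ToolCommand`` setup shown above),
+the JVM starts a program at ``main``:
+
+```scala
+// Generic example: the JVM looks for main(args: Array[String]) and starts there.
+object HelloTool {
+  def main(args: Array[String]): Unit = {
+    // args holds everything typed after the program name on the command line
+    println(s"Started with ${args.length} argument(s)")
+  }
+}
+```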
 
 
 ### Program arguments and environment variables
 
 A basic application/tool usually takes arguments to configure and set parameters to be used within the tool.
-In biopet we facilitate an ``AbstractArgs`` case-class which stores the arguments read from commandline.
+In biopet we provide an ``AbstractArgs`` case class which stores the arguments read from the command line.
 
 
 ```scala
@@ -67,8 +67,8 @@ Consuming and placing values in `Args` works as follows:
   }
 ```
 
-One has to implement class `OptParser` in order to fill `Args`. In `OptParser` one defines the commandline args and how it should be processed.
- In our example, we just copy the values passed on the commandline. Further reading: [scala scopt](https://github.com/scopt/scopt)
+One has to implement the class `OptParser` in order to fill `Args`. In `OptParser` one defines the command line args and how they should be processed.
+ In our example, we just copy the values passed on the command line. Further reading: [scala scopt](https://github.com/scopt/scopt)
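+
+As a rough sketch of what such a parser can look like, here is plain scopt (3.x style) usage,
+independent of Biopet's ``OptParser``/``AbstractArgs`` base classes; all names below are made up:
+
+```scala
+import java.io.File
+
+object ArgsExample {
+  case class Config(input: File = new File("."), output: Option[File] = None)
+
+  // Each opt defines one command line argument and how it is copied into Config.
+  val parser = new scopt.OptionParser[Config]("SimpleTool") {
+    head("SimpleTool", "0.1")
+    opt[File]('i', "input") required() valueName "<file>" action { (x, c) =>
+      c.copy(input = x) } text "input file to read"
+    opt[File]('o', "output") valueName "<file>" action { (x, c) =>
+      c.copy(output = Some(x)) } text "optional output file"
+  }
+
+  def main(args: Array[String]): Unit =
+    parser.parse(args, Config()) match {
+      case Some(config) => println(s"Reading from ${config.input}") // run the tool here
+      case None         => // scopt has already printed the usage message
+    }
+}
+```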
 
 Let's compile the code into 1 file and test with real functional code:
 
@@ -151,7 +151,7 @@ object SimpleTool extends ToolCommand {
 
 In order to use this tool within biopet, one should write an `extension` for the tool. (as we also do for normal executables like `bwa-mem`)
  
-The wrapper would look like this, basicly exposing the same commandline arguments to biopet in an OOP format.
+The wrapper would look like this, basically exposing the same command line arguments to biopet in an OOP format.
 Note: we also add some functionalities for getting summary data and passing on to biopet.
 
 The concept of having (extension)-wrappers is to create a black-box service model. One should only know how to interact with the tool without necessarily knowing the internals.