Sunday, 25 August 2013

Concurrent reading of multiple files took the same time as normal reading

I am processing my web server access logs and storing the processed
information in my database. Previously, I did this as a single-threaded
process, and it took a long time to complete. I decided to go with
concurrent file reading to save execution time, which I achieved using an
Executors thread pool. Here is my Java code.
Log File Handler
class FileHandler implements Runnable {

    private File file;

    public FileHandler(File file) {
        this.file = file;
    }

    @Override
    public void run() {
        try {
            byte[] readInputStream = readInputStream(new FileInputStream(file));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static byte[] readInputStream(InputStream in) throws IOException {
        // Closing the ByteArrayOutputStream has no effect. @see Java doc.
        ByteArrayOutputStream bos = null;
        byte[] buffer = new byte[1024];
        int bytesRead = in.read(buffer);
        // No input to read.
        if (bytesRead == -1) {
            return null;
        }
        // Creating the output stream with an approximate capacity.
        bos = new ByteArrayOutputStream(in.available());
        bos.write(buffer, 0, bytesRead);
        try {
            while ((bytesRead = in.read(buffer)) != -1) {
                bos.write(buffer, 0, bytesRead);
            }
        } finally {
            if (in != null) {
                in.close();
            }
        }
        return bos.toByteArray();
    }
}
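As an aside, on Java 7 or later the same whole-file read can be written more compactly with java.nio.file.Files, which handles stream closing internally. This is a minimal sketch for comparison (the class name WholeFileRead is illustrative), not the code benchmarked below:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class WholeFileRead {

        // Reads the entire file into memory in one call; Files.readAllBytes
        // opens and closes its own stream, so no manual close is needed.
        public static byte[] readAll(String file) throws IOException {
            return Files.readAllBytes(Paths.get(file));
        }

        public static void main(String[] args) throws IOException {
            Path tmp = Files.createTempFile("demo", ".txt");
            Files.write(tmp, "hello".getBytes());
            byte[] data = readAll(tmp.toString());
            System.out.println(data.length); // "hello" is 5 bytes
            Files.delete(tmp);
        }
    }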
Concurrent File Reading
public class AccessLogProcessor {

    public static void main(String[] args) {
        String[] files = {
            "/home/local/ZOHOCORP/bharathi-1397/Downloads/unique-invoice-zuid1.txt",
            "/home/local/ZOHOCORP/bharathi-1397/Downloads/unique-invoice-zuid.txt"
        };
        long start = System.currentTimeMillis();
        ExecutorService executors = Executors.newFixedThreadPool(files.length);
        for (String file : files) {
            executors.execute(new FileHandler(new File(file)));
        }
        executors.shutdown();
        while (!executors.isTerminated());
        System.out.println("Time taken by concurrent reading :: "
                + (System.currentTimeMillis() - start) + " ms");
    }
}
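One thing worth noting: the empty while loop spinning on isTerminated() keeps a CPU core busy while the pool drains, which can itself distort the measured time. A non-spinning alternative is awaitTermination, which blocks until the tasks finish. A minimal sketch (the class name and the trivial tasks are illustrative, not the log-reading workload above):

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class AwaitDemo {

        public static void main(String[] args) throws InterruptedException {
            ExecutorService pool = Executors.newFixedThreadPool(2);
            for (int i = 0; i < 2; i++) {
                pool.execute(() -> { /* stand-in for a file-reading task */ });
            }
            pool.shutdown();
            // Blocks without spinning, instead of while (!pool.isTerminated());
            boolean finished = pool.awaitTermination(1, TimeUnit.MINUTES);
            System.out.println(finished); // true once all tasks have completed
        }
    }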
Single Threaded File Reading
public class Test {

    public static void main(String[] args) throws FileNotFoundException, IOException {
        String[] files = {
            "/home/local/ZOHOCORP/bharathi-1397/Downloads/unique-invoice-zuid1.txt",
            "/home/local/ZOHOCORP/bharathi-1397/Downloads/unique-invoice-zuid.txt"
        };
        long start = System.currentTimeMillis();
        for (String file : files) {
            FileHandler.readInputStream(new FileInputStream(file));
        }
        System.out.println("Time taken by single-threaded reading :: "
                + (System.currentTimeMillis() - start) + " ms");
    }
}
Test result for 10 rounds of execution
Single-threaded execution: 9 ms.
Concurrent execution: 14 ms.
I am reading the files concurrently, so why is the time taken greater than
the single-threaded execution? Please correct me if I did anything wrong.
