Look at the docs for the File constructor you're calling. The only exception it's declared to throw is NullPointerException. Therefore it can't throw FileNotFoundException, which is why you're getting the error. You can't try to catch a checked exception which the compiler can prove is never thrown within the corresponding try block.
Creating a File object doesn't check for its existence. If you were opening the file (e.g. with new FileInputStream(...)), then that could throw FileNotFoundException... but not just creating a File object.
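A minimal sketch of the distinction (the file name is a placeholder; the point is that the File constructor compiles with no try/catch, while opening the file requires one):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

public class FileDemo {
    public static void main(String[] args) {
        // No try/catch needed: the File constructor only documents
        // NullPointerException (unchecked), and never touches the disk.
        File f = new File("does-not-exist.txt");
        System.out.println(f.exists());  // prints false unless such a file happens to exist

        // Opening the file is what can fail with FileNotFoundException.
        try {
            FileInputStream in = new FileInputStream(f);
            in.close();
        } catch (FileNotFoundException e) {
            System.out.println("caught: file not found");
        } catch (IOException e) {
            // close() can also fail
        }
    }
}
```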
This is because the constructor of class File with one argument,

public File(String pathname)

Parameters:
pathname - A pathname string
Throws:
NullPointerException - If the pathname argument is null

declares only one exception, NullPointerException. Your code tries to catch a FileNotFoundException, which is unrelated to NullPointerException, and that is why you get this error in Eclipse.
One way to go is to catch Exception, the superclass of all exceptions in Java. Another is to catch each exception the invoked constructor throws in its own catch block (the full list is easy to obtain from its API documentation). A third approach is to catch only those exceptions (again, ones the constructor actually throws) that make sense to your application and ignore the others.
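A sketch of the third approach, using FileInputStream (whose one-argument constructor declares FileNotFoundException); the file name is a placeholder:

```java
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

public class CatchSpecific {
    public static void main(String[] args) {
        try {
            FileInputStream in = new FileInputStream("settings.dat");  // hypothetical file
            in.close();
        } catch (FileNotFoundException e) {
            // the one checked exception this constructor declares
            System.out.println("settings.dat is missing");
        } catch (IOException e) {
            // thrown by close(); broader, so it must come after the more specific catch
            System.out.println("I/O error while closing");
        }
    }
}
```

Note the ordering: the more specific FileNotFoundException must be caught before its superclass IOException, or the code will not compile.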
use apache commons io
http://commons.apache.org/proper/commons-io/
look at their FileUtils class. Full of gold. Gold I say....
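A quick taste, assuming commons-io is on the classpath; the FileUtils methods shown are real commons-io API, but treat the file names as placeholders:

```java
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.FileUtils;

public class FileUtilsDemo {
    public static void main(String[] args) throws IOException {
        File src = new File("notes.txt");        // placeholder path
        File dst = new File("notes-copy.txt");   // placeholder path

        // One-liners for chores that take a dozen lines with raw streams:
        String text = FileUtils.readFileToString(src, StandardCharsets.UTF_8);
        FileUtils.writeStringToFile(dst, text, StandardCharsets.UTF_8);
        FileUtils.copyFile(src, dst);
    }
}
```

This requires the commons-io dependency; with plain JDK 7+ you get similar one-liners from java.nio.file.Files.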
This is not the standard way at all. This is the bad way.
The way I use most of the time is this:
ObjectOutputStream out = null;
try {
    out = new ObjectOutputStream(new FileOutputStream("file.dat"));
    // use out
} finally {
    if (out != null) {
        try {
            out.close();
        } catch (IOException e) {
            // nothing to do here except log the exception
        }
    }
}
The code in the finally block can be put in a helper method, or you can use commons IO to close the stream quietly, as noted in other answers.
A stream must always be closed in a finally block.
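The helper method mentioned above could look like this (a sketch, similar in spirit to commons-io's IOUtils.closeQuietly; the class and method names here are my own):

```java
import java.io.Closeable;
import java.io.IOException;

public final class IOUtil {
    private IOUtil() {}

    // Closes the given stream, swallowing any IOException from close().
    // Safe to call with null, which keeps the caller's finally block to one line.
    public static void closeQuietly(Closeable c) {
        if (c != null) {
            try {
                c.close();
            } catch (IOException e) {
                // nothing useful to do here except log it
            }
        }
    }
}
```

With this, the finally block above shrinks to a single `IOUtil.closeQuietly(out);`.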
Note that JDK7 will make it much easier with the new try-with-resources syntax, which will automatically close the stream at the end of the try block:
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("file.dat"))) {
// use out
}
There is no simple answer. Put yourself in the role of the user and think what they would expect, if there is a data file that is corrupted in the middle.
Let's say I have an address book with 1200 addresses, and there is one that your code cannot read. As a user, I expect to see 1199 addresses. Do I even expect an error message? I don't think so. Or at most once. Because if I use an application and every single time I search for an address I get a bloody error message I will be mightily pissed off.
Let's say my application just received a file with financial data. Say information about 217 bills that my company has to pay, and one that your code cannot read. As a user, I expect to be told that this file is corrupted, so that I can get back to the people sending the file and get a new one. Ignoring a bill that has been corrupted would be very, very bad and could lead to dire consequences.
So you see: it depends. Look at the situation and do whatever makes sense, thinking not as a software developer but as the end user who needs the most useful result.
If you're expecting dirty data:
Dirty data means some of the records may be malformed or incorrect; the overall quality of the data may not be high. In this case, push the bad records into a skip file, log them, and continue to the next record. At the end of processing, create a notification summarizing the results and noting the skip file's location.
This way one can process the majority of records and note the dirty ones for further remediation.
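A minimal sketch of the skip-file pattern; the parser, record format, and data here are all invented for illustration:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class DirtyDataDemo {
    // Hypothetical record parser: returns null for lines it cannot parse.
    static Integer parseRecord(String line) {
        try {
            return Integer.valueOf(line.trim());
        } catch (NumberFormatException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("10", "20", "oops", "40");
        List<Integer> good = new ArrayList<>();
        List<String> skipped = new ArrayList<>();  // would be written to a skip file

        for (String line : input) {
            Integer rec = parseRecord(line);
            if (rec == null) {
                skipped.add(line);  // log it and move on to the next record
            } else {
                good.add(rec);
            }
        }
        // Summary notification at the end of processing:
        System.out.println("processed=" + good.size() + " skipped=" + skipped.size());
    }
}
```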
If you're expecting clean data:
Clean data means you expect every record to be valid. If the data is not clean, stop processing and raise an alert. This scenario requires a pre-screen (first pass) of every record in the file to determine whether it is valid. If all the records are valid, the file is good and can be processed. If not, the entire file is most likely discarded, and whoever (or whatever) generates the file will have to produce a new one.
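The two-pass clean-data approach can be sketched like this; the validity rule and the data are invented for illustration:

```java
import java.util.Arrays;
import java.util.List;

public class CleanDataDemo {
    // Hypothetical validity rule: a record is a string of digits.
    static boolean isValid(String line) {
        return line.matches("\\d+");
    }

    public static void main(String[] args) {
        List<String> file = Arrays.asList("100", "200", "3OO");  // one corrupt record

        // First pass: pre-screen every record before touching any of them.
        for (String line : file) {
            if (!isValid(line)) {
                System.out.println("file rejected: corrupt record \"" + line + "\"");
                return;  // discard the whole file and alert whoever produced it
            }
        }
        // Second pass, reached only when the whole file is valid: actually process.
        for (String line : file) {
            System.out.println("processing " + line);
        }
    }
}
```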
Depending on your requirements, either option is acceptable.