How to get web page content into an Excel file?
-
Hi, I am working on a Java project. I want to get the content of a specific URL into an Excel file, i.e. go to a given URL, where different fields are defined on the web page (Name, ID, Address, etc.), fetch these fields, and store the information in an Excel file. Any help will be appreciated. Regards, mba
-
mbatra31 wrote:
Any help will be appreciated.
Help with what? You need to show what you have tried and what problems you are having.
-
Hi, I am able to get the data from the web page into an .html or .txt file. The problem is how to filter the data and extract the specific information I want; for example, the data in the .txt file is spread over multiple lines. How can I retrieve the specific fields from the file? Here is the code I have used:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;

public class GetContent {
    public static void main(String[] args) {
        URL url;
        try {
            // get URL content
            url = new URL("http://www.mims.com");
            URLConnection conn = url.openConnection();

            // open the stream and put it into a BufferedReader
            BufferedReader br = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
            String inputLine;

            // save to this filename
            String fileName = "E:\\sample.txt";
            File file = new File(fileName);
            if (!file.exists()) {
                file.createNewFile();
            }

            // use a FileWriter to write the file
            FileWriter fw = new FileWriter(file.getAbsoluteFile());
            BufferedWriter bw = new BufferedWriter(fw);
            while ((inputLine = br.readLine()) != null) {
                bw.write(inputLine);
                bw.newLine(); // preserve the page's line breaks in the output file
            }
            bw.close();
            br.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
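One possible next step, sketched below: instead of scanning the saved .txt file line by line, parse the page as HTML with a library such as jsoup and write the extracted values with Apache POI, which produces a real Excel (.xlsx) workbook. This is only a minimal sketch, not working code for your page: the CSS selectors (#name, #id, #address), the PageToExcel class name, and the output path are placeholders you would replace after inspecting the page's markup, and it assumes the jsoup and Apache POI jars are on the classpath.

import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class PageToExcel {

    public static void main(String[] args) throws IOException {
        // Fetch and parse the page; jsoup builds a DOM, so individual
        // elements can be selected instead of scanning raw text lines.
        Document doc = Jsoup.connect("http://www.mims.com").get();

        // Placeholder selectors -- inspect the page source and replace
        // them with selectors that match the real markup.
        String name    = textOrEmpty(doc.selectFirst("#name"));
        String id      = textOrEmpty(doc.selectFirst("#id"));
        String address = textOrEmpty(doc.selectFirst("#address"));

        // Write the extracted values into a real .xlsx workbook with Apache POI.
        try (Workbook wb = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream("E:\\sample.xlsx")) {
            Sheet sheet = wb.createSheet("Page data");

            Row header = sheet.createRow(0);
            header.createCell(0).setCellValue("Name");
            header.createCell(1).setCellValue("ID");
            header.createCell(2).setCellValue("Address");

            Row data = sheet.createRow(1);
            data.createCell(0).setCellValue(name);
            data.createCell(1).setCellValue(id);
            data.createCell(2).setCellValue(address);

            wb.write(out);
        }
    }

    // Returns the element's text, or an empty string if the selector matched nothing.
    private static String textOrEmpty(Element e) {
        return e == null ? "" : e.text();
    }
}

With this kind of approach you select fields by id, class, or tag rather than doing string matching on raw lines, and POI writes a file Excel can open directly, so no manual comma or tab formatting is needed.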