File manipulation using a tokenizer

import java.io.*;
import java.util.Scanner;
import java.util.StringTokenizer;

public class Filereader {
    public static void main(String[] args) throws IOException {

        File fil = new File("trans1.txt");
        FileReader inputFil = new FileReader(fil);
        BufferedReader in = new BufferedReader(inputFil);

        // Create scanner to read contents of a file
        Scanner scanner = new Scanner(new File("trans1.txt"));

        // Create an array to store contents of a file after reading them
        String[] tall = new String[100];

        int i = 0;

        while (scanner.hasNext()) {
            // Create StringTokenizer object
            StringTokenizer st = new StringTokenizer(scanner.next(), "< , { } >", true);
            tall[i] = scanner.next();
            System.out.println(tall[i]);
            i++;
        }
        in.close();
    }
}

I want to read the contents of a file into an array using the code above and print the values stored in the array to the screen. Each time I run the code I don't get the desired output. Can someone please help? The contents of my file are as follows:

5
<0,{p1}>
<0,{p1}>  -1-> <1>
<1,{p1}>  -2-> <2><3>
<2,{p0}>  -1-> <0>
<3,{p1}>  -1-> <4>
<4,{p0}>  -1-> <3>


Here's how I'd do it:

package cruft;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import java.util.StringTokenizer;

public class FileTokenizer
{
    private static final String DEFAULT_DELIMITERS = "< , { } >";
    private static final String DEFAULT_TEST_FILE = "data/trans1.txt";

    public static void main(String[] args)
    {
        try
        {
            String fileName = ((args.length > 0) ? args[0] : DEFAULT_TEST_FILE);
            FileReader fileReader = new FileReader(new File(fileName));
            FileTokenizer fileTokenizer = new FileTokenizer();
            List<String> tokens = fileTokenizer.tokenize(fileReader);
            System.out.println(tokens);
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }

    public List<String> tokenize(Reader reader) throws IOException
    {
        List<String> tokens = new ArrayList<String>();

        BufferedReader br = null;

        try
        {
            br = new BufferedReader(reader);
            Scanner scanner = new Scanner(br);
            while (scanner.hasNext())
            {
                StringTokenizer st = new StringTokenizer(scanner.next(), DEFAULT_DELIMITERS, true);
                while (st.hasMoreElements())
                {
                    tokens.add(st.nextToken());                    
                }
            }
        }
        finally
        {
            close(br);
        }

        return tokens;
    }

    public static void close(Reader r)
    {
        try
        {
            if (r != null)
            {
                r.close();
            }
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }
}

And here's the output I'd get if I run it against your file:

C:\JDKs\jdk1.6.0_13\bin\java  cruft.FileTokenizer
[5, <, 0, ,, {, p1, }, >, <, 0, ,, {, p1, }, >, -1-, >, <, 1, >, <, 1, ,, {, p1, }, >, -2-, >, <, 2, >, <, 3, >, <, 2, ,, {, p0, }, >, -1-, >, <, 0, >, <, 3, ,, {, p1, }, >, -1-, >, <, 4, >, <, 4, ,, {, p0, }, >, -1-, >, <, 3, >]

You don't say what output you expected.
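If you specifically need a plain String array (like the tall array in your original code) rather than a List, you could convert the result; this is just a small, optional addition to the code above:

List<String> tokens = fileTokenizer.tokenize(fileReader);

// Copy the List into an array sized to fit exactly
String[] tall = tokens.toArray(new String[0]);

for (String token : tall)
{
    System.out.println(token);
}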


First, you should make clear what your "desired output" is. Second, in the code you provided you never actually use your tokenizer; you create it and then throw it away. Third, you call scanner.next() twice per loop iteration, so the token handed to the StringTokenizer is discarded and you unintentionally skip half of the tokens. Fourth, why use both Scanner and StringTokenizer? They do almost the same job, so one of them is enough.
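To illustrate the second and third points, here is a sketch of one possible fix for the loop, keeping your tall array and delimiter string and assuming you want every token stored and printed: read each whitespace-separated chunk exactly once, then drain the tokenizer.

int i = 0;
while (scanner.hasNext()) {
    // Read the next whitespace-separated chunk exactly once
    String chunk = scanner.next();
    // Split the chunk on the delimiter characters, keeping the delimiters as tokens
    StringTokenizer st = new StringTokenizer(chunk, "< , { } >", true);
    while (st.hasMoreTokens() && i < tall.length) {
        tall[i] = st.nextToken();
        System.out.println(tall[i]);
        i++;
    }
}

Alternatively, you could drop the StringTokenizer entirely and call scanner.useDelimiter(...) with a suitable pattern, though note that a Scanner will not hand back the delimiter characters themselves the way StringTokenizer does when its returnDelims flag is true.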
