what's a simple way to search a very long text file for a word in python?
I want to create a very simple spell checker using a text file with an alphabetized list of about 80000 common words.
What's a simple but efficient way to search the file and find out if it contains a word using python?
Should I parse the word list file into a set, list, dictionary, tuple?
Is there an easy way to take advantage of the fact that my word list is already alphabetized?
I'd prefer to keep it relatively simple. I don't want corrected spelling recommendations or other fancy features. I just want to know if the word is misspelled.
Since 80000 words will easily fit in memory, you are best off using a set:

words = set(line.strip() for line in open("words"))

This won't take advantage of the fact that your file is already sorted, but it is the most efficient approach anyway. To look up a word w, you can use

w in words

which is amortized O(1).
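If you did want to exploit the sorted order instead of building a set, a binary search over the sorted list via the standard-library bisect module gives O(log n) lookups. A minimal sketch (the contains helper is a hypothetical name, not part of the original answer):

```python
import bisect

def contains(sorted_words, word):
    # Binary search a sorted list: find the insertion point,
    # then check whether the word is actually there.
    i = bisect.bisect_left(sorted_words, word)
    return i < len(sorted_words) and sorted_words[i] == word

words = ["apple", "banana", "cherry"]
print(contains(words, "banana"))  # True
print(contains(words, "grape"))   # False
```

In practice the set is both simpler and faster per lookup, so bisect is only worth it if you want to avoid the memory overhead of a set.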
Put your dictionary words in a set, which has constant lookup time.

myDict = set(<actual list of words here>)
for word in file:
    if word not in myDict:
        handleBadWord(word)
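Putting the pieces together, a complete minimal checker might look like this sketch (load_words, find_misspelled, and the one-word-per-line file format are assumptions for illustration, not from the original answers):

```python
def load_words(path):
    # Read one word per line into a set for O(1) membership tests;
    # lowercase everything so checks are case-insensitive.
    with open(path) as f:
        return set(line.strip().lower() for line in f)

def find_misspelled(text, words):
    # Return the whitespace-separated tokens not found in the word set.
    return [w for w in text.split() if w.lower() not in words]
```

Loading the 80000-word file is a one-time cost; every check afterwards is a single set lookup.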