Well-behaved search engines and bots look for a robots.txt file to determine what they should and should not index. Without one, crawlers will scan everything and let their own algorithms decide what to list, but even if you want everything indexed, creating a robots.txt file is a good idea.
To create a robots.txt file that tells search engines to index everything on your site, open Notepad or Notepad++ (don't use Word!), create a plain text file, and add the following code:
User-agent: *
Allow: /
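The original robots.txt convention expresses "allow everything" with an empty Disallow directive rather than Allow, and this form is understood by virtually all crawlers. An equivalent file would be:

```
User-agent: *
Disallow:
```

Either version works for major search engines. Whichever you choose, save the file as robots.txt and upload it to the root of your site (for example, https://example.com/robots.txt, where example.com stands in for your own domain), as crawlers only look for it there.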