This file is the traffic cop of search engine bots: it tells those bots which parts of a website they may and may not crawl.
While having a robots.txt file will not directly affect your search engine rankings in the SERPs (Search Engine Results Pages), its importance from an SEO vantage point should be obvious. Helping the search engine robots navigate your site efficiently can only aid your SEO efforts by getting your webpages properly indexed in Google and the other search engines.
Creating the file is simple. Open any text editor, such as Notepad, and copy and paste the following:
User-agent: *
Disallow:
Then save the file as robots.txt and upload it to the root directory of your website, so that it is reachable at www.yourdomain.com/robots.txt.
The asterisk is a wildcard denoting all search engine robots, and leaving the Disallow line blank grants them access to every part of your site. If there are areas of your website that, for whatever reason, you do not want the search engine spiders to visit, simply place the file or folder name in a Disallow line, like so:
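User-agent: *
Disallow: /private/
Disallow: /old-page.html

Here /private/ and /old-page.html are placeholder names used for illustration; substitute the actual folders or files on your own site. Each Disallow line takes a single path, matched from the beginning of the URL, so Disallow: /private/ blocks everything inside that folder. Be careful with Disallow: / on its own, as that blocks robots from your entire site.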