Help, guys! I submitted my site a month ago. I just checked my logs, and I can see that
there have been 40 requests for the robots.txt file where the crawler fetches that one file and nothing else, then leaves. Is this normal? Do search engines just check robots.txt first, then come back later to crawl?
Here's my robots.txt file:

User-agent: *
Disallow: /hid/
Disallow: /images/
Allow: /
Disallow:

Anything wrong with this?

Note:
The "Allow" is there to invite the engines to crawl everything.
Anybody know if this works?
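
In case it helps anyone answer: here's a quick sanity check of the file above using Python's standard-library urllib.robotparser. One caveat I'm aware of: Python's parser applies rules first-match-wins, while Google documents longest-match-wins, but for this particular file both approaches should agree. The "TestBot" agent name is just made up for the test.

from urllib import robotparser

# The exact robots.txt from my post.
ROBOTS_TXT = """\
User-agent: *
Disallow: /hid/
Disallow: /images/
Allow: /
Disallow:
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The "User-agent: *" group applies to any crawler name,
# so the agent string here doesn't matter.
for path in ("/", "/page.html", "/hid/secret.html", "/images/logo.png"):
    verdict = "allowed" if parser.can_fetch("TestBot", path) else "blocked"
    print(path, "->", verdict)

That should print "allowed" for / and /page.html and "blocked" for the /hid/ and /images/ paths, which is exactly what I intended the file to do.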