Okay, I'm new to the forums (thank you), and I have a simple, or seemingly simple, question regarding the robots.txt file for websites.

Based on my sample below, is it necessary to list the last user-agent group (the one that grants access to all of the site) in order for the whole site to be indexed? In other words, are the contents of a robots.txt file hierarchical? I'd appreciate an answer from anyone who knows; I have found nothing on the web that addresses this question.

Code:
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory
User-agent: googlebot-news   # only the Google News service
Disallow: /                  # disallow everything
User-agent: *                # all other robots
Disallow: /something/        # disallow this directory
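In case it helps anyone answer, here is one way to experiment with the question using Python's built-in `urllib.robotparser` (the sample file above pasted in as a string; `example.com` and the bot name `somebot` are just placeholders I made up):

```python
import urllib.robotparser

# The robots.txt sample from my question, as a string.
ROBOTS_TXT = """\
User-agent: googlebot
Disallow: /private/

User-agent: googlebot-news
Disallow: /

User-agent: *
Disallow: /something/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Each crawler obeys only the single group that matches it, so the
# groups are independent rather than hierarchical/cumulative:
print(rp.can_fetch("googlebot", "http://example.com/private/page"))    # prints False
print(rp.can_fetch("somebot", "http://example.com/private/page"))      # prints True
print(rp.can_fetch("somebot", "http://example.com/something/page"))    # prints False
```

At least with this parser, an unlisted bot like `somebot` falls through to the `*` group and is still allowed into `/private/`, which suggests the groups don't stack.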