I may not have the answer or even a helpful suggestion here, but usability research suggests that a list containing more than 8 or 9 items becomes tedious and gets skimmed or dismissed. If a list needs to hold more than 8 or 9 items, either start a 'new list' or consider some sort of sub-list.
Are all these links of yours similar? Can they be 'grouped' by some larger criterion, like "location" or "region", "subject" or "agenda", or by price ("affordable" vs. "expensive"), etc.? I find it hard to believe that 200 links can't be broken down into at least 5 or 6 sub-categories, somehow.
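To make that concrete, here is a minimal sketch of bucketing a flat link list into sub-categories. The URLs and category labels are made up for illustration; you would supply your own grouping criterion:

```python
# Hypothetical example: turning one flat 200-item link list into
# several smaller grouped lists. All data here is invented.
from collections import defaultdict

links = [
    ("http://example.com/paris-hotels", "location"),
    ("http://example.com/budget-tips", "affordability"),
    ("http://example.com/rome-hotels", "location"),
    ("http://example.com/luxury-resorts", "affordability"),
]

groups = defaultdict(list)
for url, category in links:
    groups[category].append(url)

# Each group becomes its own short list instead of one giant one.
for category, urls in sorted(groups.items()):
    print(f"{category}: {len(urls)} links")
```

Even a crude two-level split like this keeps each visible list well under the 8-or-9-item comfort zone.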
As for 'too many links harms SEO', I think the jury is still out on that one. Everything I have read suggests that this might be a problem, someday, under certain conditions. It is like those ads you see for "we guarantee to get your site tons of traffic"... what they do (after charging you a nominal fee, of course) is insert your URL 'inline' on a page that contains nothing but targeted keyword text (sometimes in headers).
Example: your site is "http://example.com"
You 'hire them' to do their service, and your URL gets stuffed into some 'deep web' page that looks like this:
....SERVICE. EFFICIENT. CUSTOMER LOYALTY, RESEARCH, RESULTS. COMMITMENT, AGENDA, SEMANTICS
REPEAT CUSTOMERS. CUSTOMER_SATISFACTION, BUSINESS. http://example.com SATISFACTION, SEO, QUALITY, MARKETABILITY,
INVESTMENT, INTUITIVE; MARKETING, DATABASE, IT, REPRESENTATIVE....
except the keywords on the page number in the hundreds or thousands. Often, several URLs from unrelated companies with similar goals are stuffed onto the same page. I used to keep compiled examples of this practice for show & tell in the CoffeeLounge. I think this is the real crux of the "too many links in a web page?" concern: someone's ability to sneak a black-hat trick into the web and create meaningless backlinks.
And oh yes, the method above MIGHT get you 1000+ unique views per day as they claim, but if you ran analytics on that traffic you would see that most users stayed on the page for less than a minute (or for an indeterminable time), meaning they backed out immediately. They realized they had made a bad jump and landed on a 'spammish' page... again, 'too many links' might be the next generation of trickery.
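If you wanted to check that claim against your own traffic, a rough sketch might look like this. The session durations are made-up numbers, and I'm assuming your analytics tool can export time-on-page per visit:

```python
# Hypothetical sketch: spotting "bad jump" traffic in a log of
# per-visit durations (seconds). Raw visit counts look great,
# but almost everyone leaves within a minute.
sessions = [3, 5, 2, 240, 4, 1, 6, 2, 300, 3]  # invented data

bounces = [s for s in sessions if s < 60]
bounce_rate = len(bounces) / len(sessions)
print(f"bounce rate: {bounce_rate:.0%}")
```

A high ratio of sub-minute visits is exactly the signature of the stuffed-keyword-page trick described above: plenty of arrivals, no real readers.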
In the interim: if a page is more than 10% 'just links' by kilobyte weight, it probably smells like hot spam to Googlebot, no matter how legit the page really is...
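As a rough sketch of that heuristic (the 10% threshold and the idea of counting anchor-tag bytes are my own assumptions, not any documented Googlebot rule), you could estimate a page's link weight like this:

```python
# Estimate what fraction of a page's bytes belong to <a> tags:
# the tag markup itself plus the anchor text inside it.
from html.parser import HTMLParser

class LinkWeight(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_a = 0        # nesting depth inside <a> tags
        self.link_bytes = 0  # bytes attributed to links

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a += 1
            # Count the raw markup of the opening tag itself.
            self.link_bytes += len(self.get_starttag_text().encode())

    def handle_endtag(self, tag):
        if tag == "a" and self.in_a:
            self.in_a -= 1
            self.link_bytes += len("</a>")

    def handle_data(self, data):
        if self.in_a:
            self.link_bytes += len(data.encode())

def link_ratio(html: str) -> float:
    parser = LinkWeight()
    parser.feed(html)
    total = len(html.encode())
    return parser.link_bytes / total if total else 0.0

page = '<html><body><p>hi</p><a href="http://example.com">x</a></body></html>'
print(f"{link_ratio(page):.0%} of this page is link markup")
```

On a real page you would feed in the fetched HTML; anything scoring far above the 10% ballpark is the kind of link-heavy page the post is warning about.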