Depending on your specific situation, I would consider a hybrid application: SQL Azure or a hosted SQL Server for the database, a web page for the data-collection piece, and a standard Access accdb FE running on each user's desktop. There is a performance penalty when pulling data down from a remote database, but if you are not gathering hundreds of inputs a day, or gathering large amounts of data, that should not have a huge impact overall.
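To give you an idea of what the Access side of that looks like, here is a minimal sketch of linking a remote table into the FE with DAO. The server, database, credentials, and table names are all placeholders for illustration; substitute your own, and note it assumes the SQL Server Native Client ODBC driver is installed.

```
' Minimal sketch: link a remote SQL Azure table into the local Access FE.
' Server, database, credentials, and table names below are placeholders.
Public Sub LinkRemoteTable()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim strConnect As String

    ' ODBC connect string for a hypothetical SQL Azure database.
    strConnect = "ODBC;Driver={SQL Server Native Client 11.0};" & _
                 "Server=tcp:yourserver.database.windows.net,1433;" & _
                 "Database=YourDb;Uid=YourUser;Pwd=YourPassword;" & _
                 "Encrypt=yes;"

    Set db = CurrentDb
    Set tdf = db.CreateTableDef("tblInquiries_Remote")
    tdf.Connect = strConnect
    tdf.SourceTableName = "dbo.tblInquiries"
    db.TableDefs.Append tdf
End Sub
```

Once the table is linked, it behaves like any other Access table in forms and queries; the penalty is just the round trip over the wire.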
If performance is unacceptable for an Access FE linked to the remote database, for things like aggregating data for reports or analyzing it for whatever purpose you gathered it, then I would look at a variation of Pat's suggestion. Link the data-gathering web page to a SQL Server or SQL Azure table for collection. Then, once an hour, once a day, or on whatever schedule you need, initiate a pass-through query from Access to retrieve any records added since the previous transfer, filtering on a date/time stamp so only new records come across. Once those records are stored locally in an Access accdb BE, you are working at LAN speeds.
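A sketch of that scheduled transfer might look like the following. It assumes a saved pass-through query (here called qryPT_NewInquiries) whose ODBC connect string is already set, plus hypothetical local tables for the destination data and the transfer log; all of those names are placeholders, not anything from your setup.

```
' Minimal sketch of the scheduled pull: fetch only rows stamped after the
' last transfer via a pass-through query, then append them to the local BE.
' Assumes qryPT_NewInquiries already exists with its Connect property set;
' table and field names are hypothetical placeholders.
Public Sub PullNewRecords()
    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef
    Dim dtLast As Date

    Set db = CurrentDb
    ' Last successful transfer time, kept in a one-row local table.
    dtLast = DLookup("LastTransfer", "tblTransferLog")

    ' The pass-through query runs entirely on the server, so only the
    ' new rows travel across the wire.
    Set qdf = db.QueryDefs("qryPT_NewInquiries")
    qdf.SQL = "SELECT * FROM dbo.tblInquiries " & _
              "WHERE CreatedOn > '" & Format(dtLast, "yyyy-mm-dd hh:nn:ss") & "';"

    ' Append the new rows into the local (LAN-speed) BE table.
    db.Execute "INSERT INTO tblInquiries_Local " & _
               "SELECT * FROM qryPT_NewInquiries;", dbFailOnError

    ' Record this run so the next pull starts from here.
    db.Execute "UPDATE tblTransferLog SET LastTransfer = Now();", dbFailOnError
End Sub
```

You can fire that from a hidden form's Timer event, a scheduled task, or a button, depending on how fresh the data needs to be.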
I designed exactly that sort of process for a client many years ago. We retrieved data once a day from a website that collected inquiries from potential clients. In that case, in fact, we didn't even control the SQL Server database. The website admin gave us read-only credentials to connect to one non-normalized table and retrieve the data we needed. We then parsed it into our local properly normalized tables.
Again, the key is latency in getting the data back from the website. If it has to be immediate, this approach may not be viable. It really depends on whether you need data from 3 seconds ago, from 30 seconds ago, or whether data from 30 minutes ago is adequate.
All of this, though, is somewhat academic until you measure it. You can sign up for a free SQL Azure account and do some testing yourself, for example with a quick timing check like the one below.
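Here is one way to get a rough round-trip number from the Immediate window; again, the connection string values and table name are placeholders for whatever your trial account gives you.

```
' Rough round-trip timing test against a trial SQL Azure database.
' Uses late-bound ADO; connection string values are placeholders.
Public Sub TimeRemoteQuery()
    Dim cn As Object, rs As Object
    Dim t As Single

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=tcp:yourserver.database.windows.net,1433;" & _
            "Initial Catalog=YourDb;User ID=YourUser;Password=YourPassword;"

    t = Timer
    Set rs = cn.Execute("SELECT COUNT(*) FROM dbo.tblInquiries;")
    Debug.Print "Round trip: " & Format(Timer - t, "0.00") & " seconds, rows: " & rs(0)

    rs.Close
    cn.Close
End Sub
```

Run that a few times at different times of day and you will know quickly whether linked tables alone will be acceptable or whether you need the scheduled-pull approach above.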