On Mon, 27 Jul 2009 22:05:41 +0300, Anakim Admin <anakim_at_gmail.com> wrote:
> Hi,
>
> We are planning to implement a solution where we host user photos and
> video files like flickr does. And, as a typical solution, we are
> planning to host these files on a NAS, put an apache server in front
> of this NAS and place a squid proxy cache as the entry point. Our
> application will be hosted on application servers, and it must have a
> different domain name, i.e. the application will be accessed via x.com
> and squid (and hence the multimedia files) via y.com, so there is no
> way of sharing authentication state and/or cookies, and I don't want
> users to have to authenticate twice, for usability reasons.
I think this may be resolved with Basic auth,
or alternatively with an external_acl_type helper that decodes the cookie
and hands back a username
(helper result: "OK user=foo").
Either of which squid can pass on to the server using 'login=PASS' on the
cache_peer line.
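A minimal sketch of the helper route, assuming Squid 2.6 or later; the
helper path, ACL name, and peer address are all hypothetical, and
session_to_user() stands in for whatever session-store lookup your
application really does:

  # squid.conf (names and addresses are placeholders)
  external_acl_type cookie_check ttl=300 %{Cookie} /usr/local/bin/cookie_check.py
  acl media_users external cookie_check
  http_access allow media_users
  cache_peer 10.0.0.5 parent 80 0 no-query originserver login=PASS

  #!/usr/bin/env python
  # cookie_check.py - hypothetical external_acl_type helper. Squid
  # writes the formatted values (here the Cookie header) to stdin, one
  # request per line, and expects "OK" (optionally with key=value pairs
  # such as user=) or "ERR" back on stdout.
  import sys

  def session_to_user(cookie_header):
      # Placeholder: parse the session id out of the cookie and look
      # it up in your session store; return the username or None.
      return None

  for line in sys.stdin:
      user = session_to_user(line.strip())
      if user:
          sys.stdout.write("OK user=%s\n" % user)
      else:
          sys.stdout.write("ERR\n")
      sys.stdout.flush()

The ttl=300 caches helper verdicts for five minutes so you are not
decoding the same cookie on every single request; tune it to taste.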
>
> But, some of these photos and videos need to be protected, for example
> a user might choose to limit access to a photo by his friends only.
>
> I looked at how flickr does it, and they do something like
> authorization by URL. I mean, the URL is specifically generated and it
> is a long URL, but if you give this URL to anyone (or if anyone sniffs
> the network), they can access this photo.
I'm not sure of the specifics of their URL generation, but that does not
sound like a security method.
As you say, anyone who finds the URL gets access. It is just enough
obscurity, with very many URLs floating around the place at once, to
prevent people tracking what belongs to whom.
>
> What I am thinking about doing is to generate time-sensitive URLs (the
> URL will be valid for 20 minutes). When the application generates a
> page and places a link to a photo in it, the URL will be something
> like http://www.y.com/asdkhjasd01.gif?t=time_t&hash=z and I can use
> the Squid redirector plugin to verify the timestamp and allow access
> to it.
No need for that. You can do it the way you said flickr does, but set an
Expires: header 20 minutes ahead, add Cache-Control: must-revalidate, and
create a new Last-Modified header timestamp every time the security
window lapses.
Each time the URL is requested, every proxy hop and browser involved will
check back with you about whether the URL is still valid. If it is not,
or if the Expires: date has been reached, they dump their copy and fetch
a fresh one from you if asked again afterwards.
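As an illustration only (timestamps invented from the date of this
mail), a response for a 20-minute window would carry headers along
these lines:

  HTTP/1.1 200 OK
  Date: Mon, 27 Jul 2009 19:05:41 GMT
  Last-Modified: Mon, 27 Jul 2009 19:05:41 GMT
  Expires: Mon, 27 Jul 2009 19:25:41 GMT
  Cache-Control: must-revalidate
  Content-Type: image/jpeg

While the copy is fresh Squid answers from its cache; once it expires,
each revalidation is a tiny If-Modified-Since request which you answer
with a 304 Not Modified (access still allowed) or a 403/404 (access
withdrawn), instead of shipping the whole object again.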
You will see a lot of stuff about people forcing youtube etc. videos and
other things to be cached, overriding the youtube settings. Don't be
worried by that. It happens because youtube go out of their way to force
people to waste bandwidth and thus money. As long as you permit caching
and set reasonably sane expiry times on the big objects, people are
generally happy to follow your settings and let the small revalidation
requests go through every time.
>
> So, my question is do you think Flickr's scheme is secure enough? If
> so, what I will be doing is even more secure, and it will let squid
> cache it for 20 minutes.
'Secure', no. 'Enough', yes.
>
> Can you please recommend anything better and/or more secure and fast?
Only real authentication is better.
>
> I would appreciate it if you can share your experience in this matter.
> Thanks.
Amos
Received on Tue Jul 28 2009 - 01:00:52 MDT