automatic login and download file
Re: automatic login and download file
I'll keep trying to get further with it; it's a little complicated, but I'm sure you can do it.
When you can, let me know the progress you have made, so I can compare it with what I've done.
Thank you.
Re: automatic login and download file
This is a bit of an adaptation of some functions I use on other sites, tailored for ddlstorage. Hopefully you can follow it; if you have any questions, let me know.
libURLDownloadToFile should be a better way to handle large files than a plain put url ("http://blah") into url ("binfile:" & saveLocation); it's more robust, and you can keep an eye on progress.
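Just to illustrate the basic shape of that approach before the full script below, here is a minimal sketch (the URL, save path and field name are placeholders, not anything from the site):
Code: Select all
on mouseUp
   -- register a status callback so we can watch progress, then start the transfer
   libURLSetStatusCallback "updateStatus", the long ID of me
   libURLDownloadToFile "http://www.example.com/somefile.zip", (specialFolderPath("desktop") & "/somefile.zip"), "downloadDone"
end mouseUp

on updateStatus pUrl, pStatus
   -- pStatus passes through states such as "contacted", "loading,bytesReceived,bytesTotal", "downloaded"
   put pStatus into field "fldStatus"
end updateStatus

on downloadDone pUrl, pStatus
   put "Finished:" && pStatus into field "fldStatus"
   libURLSetStatusCallback -- clear the callback again
end downloadDone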
Code: Select all
button script:
on mouseUp
   -- set up some basic variables
   put "www.ddlstorage.com" into tSitename
   put "thefilestring" into tFilestring -- tFilestring is the letter/number string which is appended to the url to identify the file you are trying to download
   put "thefilename.ext" into tFilename -- don't really need this one as the filename is extracted from the returned page anyway
   put "username" into tUsername
   put "password" into tPassword
   put specialFolderPath("desktop") into tSaveLocation -- or wherever you choose
   -- the following function will download the file if available and return a status code
   put fetchDownloadFile(tSitename,tFilestring,tFilename,tUsername,tPassword,tSaveLocation) into field "fldStatus"
end mouseUp
Code: Select all
card script:
function fetchDownloadFile pSitename,pFilestring,pFilename,pUsername,pPassword,pSaveLocation
   global gStatus
   set the httpHeaders to empty
   -- construct the first set of parameters to post to the login URL to obtain credentials
   -- (some other variations also work, but this one seemed consistent)
   put libURLFormData("op","login","redirect",("http://" & pSitename & "/" & pFilestring),"login",pUsername,"password",pPassword) into tData
   -- making a direct post to the login.html page means we can obtain authenticated session credentials
   -- and then fetch the right pages without having to follow redirects and parse them all on the fly
   post tData to url ("http://" & pSitename & "/login.html")
   -- save the returned page data
   put it into tHtmlPage
   -- this is the important part: find out what session cookies were set
   put libUrlLastRHHeaders() into tRHeaders
   -- parse returned headers to extract the important cookies
   get matchText(tRHeaders,"Set-Cookie: (aff=.*); domain",tAffCookie)
   get matchText(tRHeaders,"Set-Cookie: (login=.*); domain",tLoginCookie)
   get matchText(tRHeaders,"Set-Cookie: (xfss=.*); domain",tXfssCookie)
   put tLoginCookie & ";" & tXfssCookie into tCookie
   if tAffCookie is not empty then
      put ";" & tAffCookie after tCookie
   end if
   -- fake the authentication cookies
   set the httpHeaders to empty
   -- this function sets up some static headers that a typical browser would generate,
   -- plus specific Host:, Cookie: and Referer: headers
   put fnGetStaticHeaders(pSitename,tCookie,pSitename & "/login.html") into tHeaders
   set the httpHeaders to tHeaders
   -- get the redirected page content
   put "http://" & pSitename & "/" & pFilestring into tUrl
   put url(tUrl) into tHtmlPage
   put libUrlLastRHHeaders() into tRHeaders
   -- parse returned headers to extract the important cookies
   get matchText(tRHeaders,"Set-Cookie: (aff=.*); domain",tAffCookie)
   if tAffCookie is not empty then
      put ";" & tAffCookie after tCookie
   end if
   -- check if the returned page was gzipped, so we can decompress the html page if necessary
   if matchText(tRHeaders,"Content-Encoding: gzip") then
      put decompress(tHtmlPage) into tHtmlPage
   end if
   -- parse returned page content to extract the form fields to post on to the next page
   get matchText(tHtmlPage,"name=\"op\".*value=\"(.*)\">",tOpValue)
   get matchText(tHtmlPage,"name=\"id\".*value=\"(.*)\">",tIdValue)
   get matchText(tHtmlPage,"name=\"rand\".*value=\"(.*)\">",tRandFileValue)
   -- tRandFileValue is important to identify the right file against tFilestring (also supplied in this form as the "id" input value)
   get matchText(tHtmlPage,"name=\"referer\".*value=\"(.*)\">",tRefererValue)
   get matchText(tHtmlPage,"name=\"method_free\".*value=\"(.*)\">",tMethodFreeValue)
   get matchText(tHtmlPage,"name=\"method_premium\".*value=\"(.*)\">",tMethodPremiumValue)
   set the httpHeaders to empty
   put fnGetStaticHeaders(pSitename,tCookie,pSitename & "/login.html") into tHeaders
   set the httpHeaders to tHeaders
   -- use the parsed values to make a new post
   put libURLFormData("op",tOpValue,"id",tIdValue,"rand",tRandFileValue,"referer",tRefererValue,"method_free",tMethodFreeValue,"method_premium",tMethodPremiumValue) into tData
   libURLFollowHttpRedirects true
   post tData to url ("http://" & pSitename & "/" & pFilestring)
   put it into tHtmlPage
   put libUrlLastRHHeaders() into tRHeaders
   -- check if the returned page was gzipped, so we can decompress the html page if necessary
   if matchText(tRHeaders,"Content-Encoding: gzip") then
      put decompress(tHtmlPage) into tHtmlPage
   end if
   -- extract the URL path to the file for downloading
   put "parent.location='(.*)'" & quote & "></input>" into tNeedle
   if matchText(tHtmlPage,tNeedle,tDownloadUrl) then
      -- we have a URL to the download, so:
      -- set up the status callback
      libURLSetStatusCallback "updateStatus",the long ID of me
      -- ensure the correct authentication/cookie collection is submitted
      put tDownloadUrl into tHostServer
      replace "http://" with empty in tHostServer
      set the itemDelimiter to "/"
      put item 1 of tHostServer into tHostServer
      set the httpHeaders to empty
      put fnGetStaticHeaders(tHostServer,tCookie) into tHeaders
      set the httpHeaders to tHeaders
      -- read the filename from the URL (you don't really need to have passed a filename to the main function)
      put the last item of tDownloadUrl into tFilename
      set the itemDelimiter to comma
      libURLDownloadToFile tDownloadUrl,(pSaveLocation & "/" & tFilename),"downloadComplete"
      -- when the download is completed it will trigger the "downloadComplete" handler,
      -- but in the meantime, just wait and let the callback update the status field
      wait until gStatus is among the items of "Download complete,error,timeout" with messages
   else
      put "No download URL extracted" into gStatus
   end if
   return gStatus
end fetchDownloadFile

function fnGetStaticHeaders pHost,pCookie,pReferer
   if pHost is not empty then
      put "Host:" && pHost & cr into tHeaders
   end if
   put "User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:23.0) Gecko/20100101 Firefox/23.0" after tHeaders
   put cr & "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" after tHeaders
   put cr & "Accept-Language: en-gb,en;q=0.5" after tHeaders
   put cr & "Accept-Encoding: gzip,deflate" after tHeaders
   put cr & "Connection: keep-alive" after tHeaders
   if pReferer is not empty then
      put cr & "Referer:" && pReferer after tHeaders
   end if
   if pCookie is not empty then
      put cr & "Cookie: lang=english;" & pCookie after tHeaders
   end if
   return tHeaders
end fnGetStaticHeaders

on updateStatus pUrl, pStatus
   global gStatus
   put pStatus into gStatus
   put gStatus && the milliseconds into field "fldStatus"
   -- you could use this callback to drive a progress bar so the user can watch the download rate in progress
end updateStatus

on downloadComplete
   global gStatus
   put "Download complete" into gStatus
   libURLSetStatusCallback -- clear the callback
end downloadComplete
Re: automatic login and download file
That was a great job, thank you. Now I'm trying to change some things, but it works; I need to understand better how it functions and what it does with the cookies.
I'll definitely take advantage of your wisdom for the explanations ;)
Re: automatic login and download file
Hello, thanks again for the code. I made some changes and everything works as I needed. I wanted to ask you for some explanation: the site easybytez.com uses the same method as ddlstorage, based on the PHP XFileSharingPro script, so why doesn't it work with this code? I modified a few things, such as login to login2, and the rest of the PHP site is identical, but I can't manage to get the link to the file.
Can you explain to me why?
Re: automatic login and download file
You have to find out what headers are sent back from the site and then parse the session cookies, so you can set matching cookie headers in your next request. You then have to find out what fields are being requested and post the correct name=value pairs. The specifics of what you need to find and send depend entirely on the individual site, so you will have to parse and extract the parts you need on a case-by-case basis.
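The general pattern, as a rough sketch (the URL, cookie name and form fields here are only examples; every site names these differently):
Code: Select all
-- post the login form and capture the headers that come back
put libURLFormData("login","myUser","password","myPass") into tData
post tData to url "http://www.example.com/login.html"
put libUrlLastRHHeaders() into tRHeaders
-- pull out whatever session cookie the site set (the name varies by site)
get matchText(tRHeaders,"Set-Cookie: (session=.*);",tSessionCookie)
-- send the same cookie back with the next request to stay authenticated
set the httpHeaders to "Cookie:" && tSessionCookie
put url "http://www.example.com/members.html" into tHtmlPage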
Re: automatic login and download file
Hi, thanks for your reply ;)
That is what I'm trying to do, but I can't figure out where I'm going wrong. On top of that, after the request it sometimes seems that the server blocks my IP address and I have to reset the router. Do you have any suggestions as to why this happens?
Re: automatic login and download file
Probably because you have not made an initial post to the login page and read the returned headers to extract the right cookies for your session to stay authenticated. The cookies returned for ddlstorage will not be the same ones another site uses, so you will have to check what is being returned by examining libUrlLastRHHeaders and taking the important parts from there. There may be other features and form fields to post as well. All you can do is try, check the results, and try again.
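A quick way to see exactly what a given site is handing back is to list the Set-Cookie lines straight after the post, along these lines (a diagnostic sketch only):
Code: Select all
put libUrlLastRHHeaders() into tRHeaders
repeat for each line tLine in tRHeaders
   -- collect every cookie the server tried to set
   if tLine begins with "Set-Cookie:" then
      put tLine & cr after tCookieReport
   end if
end repeat
answer tCookieReport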
Re: automatic login and download file
Hello, you're always so quick to respond. In fact ddlstorage.com, easybytez.com and backin.net are all based on the same script. They certainly set different cookies, and I do make the initial request and read the headers, but I can't work out which cookies are needed to stay logged in.
The first request is working, because I get the response with cookies. This is the libUrlLastRHHeaders response for backin.net:
---------------------------------------------------
HTTP/1.1 302 Moved
Server: cloudflare-nginx
Date: Tue, 15 Oct 2013 20:29:07 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=dd9765927c65fb66f2149759cb018b4f61381868946690; expires=Mon, 23-Dec-2019 23:50:00 GMT; path=/; domain=.backin.net; HttpOnly
Set-Cookie: login=carlos2013; domain=.backin.net; path=/; expires=Sun, 13-Apr-2014 20:30:25 GMT
Set-Cookie: xfss=zdc1b71q3wil2lim; domain=.backin.net; path=/; expires=Thu, 14-Nov-2013 20:30:25 GMT
Location: http://backin.net/?op=my_files
CF-RAY: bdf4274decc02dc
------------------------------------------------------------
The problem is that even after changing the headers, I am still denied access. This is the part where I think the problem starts and access is refused:
-----------------------------------------------------------------
-- fake the authentication cookies
set the httpHeaders to empty
-- this function sets up some static headers that a typical browser would generate,
-- plus specific Host:, Cookie: and Referer: headers (tCray is the CF-RAY value, but I'm not sure it is necessary)
put fnGetStaticHeaders(pSitename,tCookie,pSitename & "/login.html",tCray) into tHeaders
set the httpHeaders to tHeaders
answer tHeaders
-- get the redirected page content
put "http://" & pSitename & "/" & pFilestring into tUrl
put url(tUrl) into tHtmlPage
put libUrlLastRHHeaders() into tRHeaders
answer tRHeaders
-----------------------------------------
Do you have any more advice for me?
I hope you can help me understand this ;)
Re: automatic login and download file
This is the part in the previous script which parsed the returned headers to be able to set your cookies:
Code: Select all
-- parse returned headers to extract the important cookies
get matchText(tRHeaders,"Set-Cookie: (aff=.*); domain",tAffCookie)
get matchText(tRHeaders,"Set-Cookie: (login=.*); domain",tLoginCookie)
get matchText(tRHeaders,"Set-Cookie: (xfss=.*); domain",tXfssCookie)
put tLoginCookie & ";" & tXfssCookie into tCookie
if tAffCookie is not empty then
   put ";" & tAffCookie after tCookie
end if
The login and, it appears, xfss cookies should be extracted OK. The first cookie has nothing to do with aff; it starts with __cfduid, which ends in "uid", so it may be a "unique ID" cookie. Try a few more times and see if there is always a __cfduid cookie being set. If so, you can change the above to:
Code: Select all
-- parse returned headers to extract the important cookies
get matchText(tRHeaders,"Set-Cookie: (__cfduid=.*);",tUidCookie)
get matchText(tRHeaders,"Set-Cookie: (login=.*); domain",tLoginCookie)
get matchText(tRHeaders,"Set-Cookie: (xfss=.*); domain",tXfssCookie)
put tUidCookie & ";" & tLoginCookie & ";" & tXfssCookie into tCookie
If you find that CF-RAY is required (it's not a "cookie" as such) you could extract it in the same sort of way: get matchText(tRHeaders,"CF-RAY: (.*)",tCray) (assuming it's always the last line of the headers).
Calling the fnGetStaticHeaders function with four parameters as it stands is not useful: the function is not expecting a tCray parameter, so you would have to add something to the function to deal with it.
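If it does turn out that CF-RAY has to be sent back, one way would be an optional fourth parameter, along these lines (a sketch only; I'm not certain the site actually requires the header to be echoed):
Code: Select all
function fnGetStaticHeaders pHost,pCookie,pReferer,pCray
   if pHost is not empty then
      put "Host:" && pHost & cr into tHeaders
   end if
   put "User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:23.0) Gecko/20100101 Firefox/23.0" after tHeaders
   put cr & "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" after tHeaders
   put cr & "Accept-Language: en-gb,en;q=0.5" after tHeaders
   put cr & "Accept-Encoding: gzip,deflate" after tHeaders
   put cr & "Connection: keep-alive" after tHeaders
   if pReferer is not empty then
      put cr & "Referer:" && pReferer after tHeaders
   end if
   if pCookie is not empty then
      put cr & "Cookie: lang=english;" & pCookie after tHeaders
   end if
   -- speculative: only send CF-RAY back if testing shows the site needs it
   if pCray is not empty then
      put cr & "CF-RAY:" && pCray after tHeaders
   end if
   return tHeaders
end fnGetStaticHeaders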
Re: automatic login and download file
I had already made the change to the cookies, from aff to __cfduid, and I had also checked the CF-RAY,
but the second request gives me 403 Forbidden; it is a Cloudflare security check.
Re: automatic login and download file
Well, I can't tell you any more without full site and login details. It's a painstaking matter of analysing the headers as they are sent and received by a browser, then making a request from LiveCode and extracting what you need from the returned headers so they match, including the referer and any other site specifics. Of course, assuming you get a matching session to proceed, you will need to read the page and extract the right name and value pairs to post to get to the next page. You say the script is identical, but there must be something in particular you need to parse. You just have to make sure you track down the right details; there's no way they can be made up.
Re: automatic login and download file
Thanks for the reply. I have analyzed the headers and the logins are the same as ddlstorage. The site is http://www.backin.net; it uses Cloudflare to check accesses, though I don't know if that is useful to know.
If you could verify this and give me an explanation, that would be great.
I hope you can understand and help me.
Thanks for everything.
Last edited by boysky76 on Wed Oct 16, 2013 10:07 pm, edited 1 time in total.
Re: automatic login and download file
I meant "I'm not psychic" not "give me your login". Please delete your personal information.
I will see if I can tell if you need to change something simple, but I can't make any promises.
Re: automatic login and download file
Thank you for everything; I hope you can figure out the problem.