JavaEar focuses on collecting and sharing valuable technical resources

Yahoo Finance URL not working

I have been using the following URL to fetch historical data from Yahoo Finance for quite some time now, but it stopped working as of yesterday.

When browsing to this site it says:

Will be right back...

Thank you for your patience.

Our engineers are working quickly to resolve the issue.

However, since this issue has persisted since yesterday, I am starting to think that they may have discontinued this service.

My SO search only pointed me to this topic, which was related to HTTPS, though...

Is anyone else experiencing this issue? How can I resolve this problem? Do they offer a different access to their historical data?


  1. The URL for downloading historical data is now something like this:

        https://query1.finance.yahoo.com/v7/finance/download/AAPL?period1=1510340760&period2=1510663712&interval=1d&events=history&crumb=XXXXXXXXXXX

    Note the above URL will not work for you or anyone else. You'll get something like this:

        "finance": {
            "error": {
                "code": "Unauthorized",
                "description": "Invalid cookie"

    It seems that Yahoo is now using some hashing to prevent people from accessing the data like you did. The URL varies with each session so it's very likely that you can't do this with a fixed URL anymore.

    You'll need to do some scraping to get the correct URL from the main page, for example:
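    A minimal Python sketch of that scrape (the helper names are mine; the quote-page regex and the download URL shape are assumptions based on the other answers in this thread):

```python
import re

# The crumb is embedded in the quote page's HTML as "CrumbStore":{"crumb":"..."}
CRUMB_RE = re.compile(r'"CrumbStore":\{"crumb":"([^"]+)"\}')

def extract_crumb(html):
    """Return the crumb found in a Yahoo quote-page HTML string, or None."""
    m = CRUMB_RE.search(html)
    return m.group(1) if m else None

def build_download_url(symbol, period1, period2, crumb):
    """Assemble the session-specific v7 download URL described above."""
    return ('https://query1.finance.yahoo.com/v7/finance/download/{0}'
            '?period1={1}&period2={2}&interval=1d&events=history&crumb={3}'
            .format(symbol, period1, period2, crumb))
```

    The B cookie set by the same page response must be sent along with the download request, as the later answers show.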

  2. Answer 2
  3. For the Python lovers out there, I've updated the Yahoo Finance downloader in the tradingWithPython library.

    There is also an example notebook, based on the tips by Ed0906, demonstrating how to get the data step by step.

  4. Answer 3
  5. I'm in the same boat. Getting there slowly. The download link on the historical prices page still works. So I added the export-cookies extension to Firefox, logged in to Yahoo, and dumped the cookies. I used the crumb value from an interactive session and was able to retrieve values. Here's part of a test Perl script that worked.

    use Time::Local;
    # create unix time variables for start and end date values: 1/1/2014 thru 12/31/2017
    $p1= timelocal(0,0,0,1,0,114);
    $p2= timelocal(0,0,0,31,11,117);
    $symbol = 'AAPL';
    # create variable for string to be executed as a system command
    # cookies.txt exported from firefox
    # crumb variable retrieved from yahoo download data link
    $task = "wget --load-cookies cookies.txt --no-check-certificate -T 30 -O $symbol.csv \"https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$p1&period2=$p2&interval=1d&events=history&crumb=7WhHVu5N4e3\" ";
    #show what we're executing
    print $task;
    # execute system command using backticks
    `$task`;
    #output is AAPL.csv

    It'll take a while to automate what I do. Hopefully yahoo will simplify or give some guidance on it if they really intend for people to use it.

  6. Answer 4
  7. For Java lovers.

    You can access your cookies from a URLConnection this way.

     //  "";
     URLConnection con = url.openConnection();
     for (Map.Entry<String, List<String>> entry : con.getHeaderFields().entrySet()) {
            if (entry.getKey() == null 
                || !entry.getKey().equals("Set-Cookie"))
            for (String s : entry.getValue()) {
               // store your cookie

    now you can search for the crumb in the yahoo site:

    String crumb = null;
    InputStream inStream = con.getInputStream();
    InputStreamReader irdr = new InputStreamReader(inStream);
    BufferedReader rsv = new BufferedReader(irdr);
    Pattern crumbPattern = Pattern.compile(".*\"CrumbStore\":\\{\"crumb\":\"([^\"]+)\"\\}.*");
    String line = null;
    while (crumb == null && (line = rsv.readLine()) != null) {
        Matcher matcher = crumbPattern.matcher(line);
        if (matcher.matches())
            crumb = matcher.group(1);
    }

    and finally, setting the cookie

    String quoteUrl = "https://query1.finance.yahoo.com/v7/finance/download/AAPL"
                               + "?period1=0&period2=9999999999&interval=1d&events=history&crumb="
                               + crumb;
    List<String> cookies = cookieStore.get(key);
    if (cookies != null) {
        for (String c : cookies)
            con.setRequestProperty("Cookie", c);
    }
  8. Answer 5
  9. Fully working PHP example, based on this post and related sources:

    function readYahoo($symbol, $tsStart, $tsEnd) {
      preg_match('#"CrumbStore":\{"crumb":"(?<crumb>[^"]+)"\}#',
        file_get_contents('https://finance.yahoo.com/quote/' . $symbol),
        $crumb);  // can contain \uXXXX chars
      if (!isset($crumb['crumb'])) return 'Crumb not found.';
      $crumb = json_decode('"' . $crumb['crumb'] . '"');  // \uXXXX to UTF-8
      foreach ($http_response_header as $header) {
        if (0 !== stripos($header, 'Set-Cookie: ')) continue;
        $cookie = substr($header, 14, strpos($header, ';') - 14);  // after 'B='
      }  // cookie looks like "fkjfom9cj65jo&b=3&s=sg"
      if (!isset($cookie)) return 'Cookie not found.';
      $fp = fopen('https://query1.finance.yahoo.com/v7/finance/download/' . $symbol
        . '?period1=' . $tsStart . '&period2=' . $tsEnd . '&interval=1d'
        . '&events=history&crumb=' . $crumb, 'rb', FALSE,
        stream_context_create(array('http' => array('method' => 'GET',
          'header' => 'Cookie: B=' . $cookie))));
      if (FALSE === $fp) return 'Can not open data.';
      $buffer = '';
      while (!feof($fp)) $buffer .= implode(',', fgetcsv($fp, 5000)) . PHP_EOL;
      fclose($fp);
      return $buffer;
    }


    $csv = readYahoo('AAPL', mktime(0, 0, 0, 6, 2, 2017), mktime(0, 0, 0, 6, 3, 2017));
  10. Answer 6
  11. It looks like they have started adding a required cookie, but you can retrieve this fairly easily, for example:

    GET https://finance.yahoo.com/quote/AAPL/history (any quote page works)
    Responds with the header in the form:

    set-cookie:B=xxxxxxxx&b=3&s=qf; expires=Fri, 18-May-2018 00:00:00 GMT; path=/;

    You should be able to read this and attach it to your .csv request:

    cookie: B=xxxxxxxx&b=3&s=qf;

    Note the crumb query parameter, this seems to correspond to your cookie in some way. Your best bet is to scrape this from the HTML response to your initial GET request. Within that response, you can do a regex search for: "CrumbStore":\{"crumb":"(?<crumb>[^"]+)"\} and extract the crumb matched group.

    Once you have that crumb value, though, it looks like you can use it with the same cookie on any symbol/ticker for the next year, meaning you shouldn't have to do the scrape too frequently.

    To get current quotes just load:

    https://query1.finance.yahoo.com/v8/finance/chart/AAPL?interval=2m
    • AAPL substituted with your stock ticker
    • interval one of [1m, 2m, 5m, 15m, 30m, 60m, 90m, 1h, 1d, 5d, 1wk, 1mo, 3mo]
    • optional period1 query param with your epoch range start date e.g. period1=1510340760
    • optional period2 query param with your epoch range end date e.g. period2=1510663712
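    The options above can be combined into a request URL with a small Python helper (a sketch; the helper name is mine and the v8 chart endpoint is assumed):

```python
def build_chart_url(ticker, interval='1d', period1=None, period2=None):
    """Build a v8 chart-API URL from the options listed above."""
    url = ('https://query1.finance.yahoo.com/v8/finance/chart/{0}'
           '?interval={1}'.format(ticker, interval))
    if period1 is not None:
        url += '&period1={0}'.format(period1)  # epoch range start
    if period2 is not None:
        url += '&period2={0}'.format(period2)  # epoch range end
    return url
```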
  12. Answer 7
  13. Python

    I used this code to get cookie (copied from fix-yahoo-finance):

    import re
    import requests

    def get_yahoo_crumb_cookie():
        """Get Yahoo crumb cookie value."""
        res = requests.get('https://finance.yahoo.com/quote/SPY/history')
        yahoo_cookie = res.cookies['B']
        yahoo_crumb = None
        pattern = re.compile(r'.*"CrumbStore":\{"crumb":"(?P<crumb>[^"]+)"\}')
        for line in res.text.splitlines():
            m = pattern.match(line)
            if m is not None:
                yahoo_crumb = m.groupdict()['crumb']
        return yahoo_cookie, yahoo_crumb

    then this code to get response:

    import time

    cookie, crumb = get_yahoo_crumb_cookie()
    params = {
        'period1': 0,
        'period2': int(time.time()),
        'interval': '1d',
        'crumb': crumb,
    }
    url_price = 'https://query1.finance.yahoo.com/v7/finance/download/{symbol}'
    response = requests.get(url_price.format(symbol=stock.symbol),
                            params=params, cookies={'B': cookie})

    This looks nice as well

  14. Answer 8
  15. I am the author of this service

    Basic info here

    Daily prices

    You need to be familiar with RESTFUL services.

    Historical prices

    You have to provide a date range:

    If you don't provide begin or end it will use the earliest or current date:

    Multiple tickers

    You can just comma separate tickers:,MSFT&begin=2012-02-19

    Rate limit

    All requests are rate limited to 10 requests per hour. If you want to register for full API access, send me a DM on Twitter. You will receive an API key to add to the URL.

    We are setting up a PayPal account for paid subscriptions without rate limits.

    List of tickers available

    I am also working to provide fundamental data and company data from EDGAR. Cheers.

  16. Answer 9
  17. VBA

    Here are some VBA functions that download and extract the cookie / crumb pair and return these in a Collection, and then use these to download the csv file contents for a particular code.

    The containing project should have a reference to the 'Microsoft XML, v6.0' library added (other versions might be fine too, with some minor changes to the code).

    Sub Test()
        Dim X As Collection
        Set X = FindCookieAndCrumb()
        Debug.Print X!cookie
        Debug.Print X!crumb
        Debug.Print YahooRequest("AAPL", DateValue("31 Dec 2016"), DateValue("30 May 2017"), X)
    End Sub
    Function FindCookieAndCrumb() As Collection
        ' Tools - Reference : Microsoft XML, v6.0
        Dim http    As MSXML2.ServerXMLHTTP60
        Dim cookie  As String
        Dim crumb   As String
        Dim url     As String
        Dim Pos1    As Long
        Dim X       As String
        Set FindCookieAndCrumb = New Collection
        Set http = New MSXML2.ServerXMLHTTP60
        url = "https://finance.yahoo.com/lookup"
        http.Open "GET", url, False
        ' http.setProxy 2, "https=", ""
        ' http.setRequestHeader "Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8"
        ' http.setRequestHeader "Accept-Encoding", "gzip, deflate, sdch, br"
        ' http.setRequestHeader "Accept-Language", "en-ZA,en-GB;q=0.8,en-US;q=0.6,en;q=0.4"
        http.setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36"
        http.send
        X = http.responseText
        Pos1 = InStr(X, "CrumbStore")
        X = Mid(X, Pos1, 44)
        X = Mid(X, 23, 44)
        Pos1 = InStr(X, """")
        X = Left(X, Pos1 - 1)
        FindCookieAndCrumb.Add X, "Crumb"
        X = http.getResponseHeader("set-cookie")
        Pos1 = InStr(X, ";")
        X = Left(X, Pos1 - 1)
        FindCookieAndCrumb.Add X, "Cookie"
    End Function
    Function YahooRequest(ShareCode As String, StartDate As Date, EndDate As Date, CookieAndCrumb As Collection) As String
        ' Tools - Reference : Microsoft XML, v6.0
        Dim http            As MSXML2.ServerXMLHTTP60
        Dim cookie          As String
        Dim crumb           As String
        Dim url             As String
        Dim UnixStartDate   As Long
        Dim UnixEndDate     As Long
        Dim BaseDate        As Date
        Set http = New MSXML2.ServerXMLHTTP60
        cookie = CookieAndCrumb!cookie
        crumb = CookieAndCrumb!crumb
        BaseDate = DateValue("1 Jan 1970")
        If StartDate = 0 Then StartDate = BaseDate
        UnixStartDate = (StartDate - BaseDate) * 86400
        UnixEndDate = (EndDate - BaseDate) * 86400
        url = "" & ShareCode & "?period1=" & UnixStartDate & "&period2=" & UnixEndDate & "&interval=1d&events=history&crumb=" & crumb
        http.Open "GET", url, False
        http.setRequestHeader "Cookie", cookie
        YahooRequest = http.responseText
    End Function
  18. Answer 10
  19. I used a PHP script using fopen() to access the financial data; here are the snippets that I modified to get it back to work:

    Creating the timestamps for start date and end date:

    $timestampStart = mktime(0,0,0,$startMonth,$startDay,$startYear);
    $timestampEnd = mktime(0,0,0,$endMonth,$endDay,$endYear);

    Force fopen() to send the required cookie with hard coded values:

    $opts = array(
        'http' => array(
            'method' => "GET",
            'header' => "Accept-language: en\r\n" .
                "Cookie: B=" . $cookie . "\r\n"
        )
    );
    $context = stream_context_create($opts);

    Use fopen() to get the csv file:

    $handle = fopen("".$ticker."?period1=".$timestampStart."&period2=".$timestampEnd."&interval=1d&events=history&crumb=".$crumb."", "r", false, $context);

    Now you can do all the magic you did before inside this while loop:

    while (!feof($handle)) {
        $line_of_text = fgetcsv($handle, 5000);
        // ... your processing here ...
    }

    Make sure to set your own values for $ticker, $crumb and $cookie in the snippets above. Follow Ed0906's approach on how to retrieve $crumb and $cookie.

  20. Answer 11
  21. For those Excel/VBA users I have used the suggestions above to develop a VBA method to extract historical prices from the updated Yahoo website. The key code snippets are listed below and I have also provided my testing workbook.

    First, a request to get the Crumb and Cookie values set before attempting to extract the data from Yahoo for the prices.

    Dim strUrl                      As String: strUrl = "https://finance.yahoo.com/lookup"    'Symbol lookup used to set the values
    Dim objRequest                  As WinHTTP.WinHttpRequest
    Set objRequest = New WinHttp.WinHttpRequest
    With objRequest
        .Open "GET", strUrl, True
        .setRequestHeader "Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"
        .send
        .waitForResponse
        strCrumb = strExtractCrumb(.responseText)
        strCookie = Split(.getResponseHeader("Set-Cookie"), ";")(0)
    End With

    See the following Yahoo Historical Price Extract link to my website for a sample file and more details on the method I have used to extract historical security prices from the Yahoo website

  22. Answer 12
  23. I found another Yahoo site that does not require cookies, but generates JSON output:

    it was pointed out from here:

    As it turned out, they seem to support 'period1' and 'period2' (in Unix time) parameters, which can be used instead of the 'interval'.

    String quoteSite = ""
                       + symbolName + "?"
                       + "period1=" + period1
                       + "&period2=" + period2
                       + "&interval=1d&indicators=quote&includeTimestamps=true";

    And the following parses the JSON for me:

    JSONObject topObj = new JSONObject(inp);
    Object error = topObj.getJSONObject("chart").get("error");
    if (!error.toString().equals("null")) {
        return null;
    }
    JSONArray results = topObj.getJSONObject("chart").getJSONArray("result");
    if (results == null || results.length() != 1) {
        return null;
    }
    JSONObject result = results.getJSONObject(0);
    JSONArray timestamps = result.getJSONArray("timestamp");
    JSONObject indicators = result.getJSONObject("indicators");
    JSONArray quotes = indicators.getJSONArray("quote");
    if (quotes == null || quotes.length() != 1) {
        return null;
    }
    JSONObject quote = quotes.getJSONObject(0);
    JSONArray adjcloses = indicators.getJSONArray("adjclose");
    if (adjcloses == null || adjcloses.length() != 1) {
        return null;
    }
    JSONArray adjclose = adjcloses.getJSONObject(0).getJSONArray("adjclose");
    JSONArray open = quote.getJSONArray("open");
    JSONArray close = quote.getJSONArray("close");
    JSONArray high = quote.getJSONArray("high");
    JSONArray low = quote.getJSONArray("low");
    JSONArray volume = quote.getJSONArray("volume");
  24. Answer 13
  25. Answer 14
  26. I managed to work out a .NET class to obtain a valid token (cookie and crumb) from Yahoo Finance

    For a complete API library for fetching historical data from the new Yahoo Finance, you may visit YahooFinanceAPI on GitHub

    Here is the class to grab the cookie and crumb


    using System;
    using System.Diagnostics;
    using System.Net;
    using System.IO;
    using System.Text.RegularExpressions;

    namespace YahooFinanceAPI
    {
        /// <summary>
        /// Class for fetching token (cookie and crumb) from Yahoo Finance
        /// Copyright Dennis Lee
        /// 19 May 2017
        /// </summary>
        public class Token
        {
            public static string Cookie { get; set; }
            public static string Crumb { get; set; }

            private static Regex regex_crumb;

            /// <summary>
            /// Refresh cookie and crumb value from Yahoo Finance
            /// </summary>
            /// <param name="symbol">Stock ticker symbol</param>
            /// <returns></returns>
            public static bool Refresh(string symbol = "SPY")
            {
                try
                {
                    Token.Cookie = "";
                    Token.Crumb = "";
                    string url_scrape = "https://finance.yahoo.com/quote/{0}?p={0}";
                    //url_scrape = "https://finance.yahoo.com/quote/{0}/history";
                    string url = string.Format(url_scrape, symbol);
                    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
                    request.CookieContainer = new CookieContainer();
                    request.Method = "GET";
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        string cookie = response.GetResponseHeader("Set-Cookie").Split(';')[0];
                        string html = "";
                        using (Stream stream = response.GetResponseStream())
                        {
                            html = new StreamReader(stream).ReadToEnd();
                        }
                        if (html.Length < 5000)
                            return false;
                        string crumb = getCrumb(html);
                        html = "";
                        if (crumb != null)
                        {
                            Token.Cookie = cookie;
                            Token.Crumb = crumb;
                            Debug.Print("Crumb: '{0}', Cookie: '{1}'", crumb, cookie);
                            return true;
                        }
                    }
                }
                catch (Exception ex)
                {
                    Debug.Print(ex.Message);
                }
                return false;
            }

            /// <summary>
            /// Get crumb value from HTML
            /// </summary>
            /// <param name="html">HTML code</param>
            /// <returns></returns>
            private static string getCrumb(string html)
            {
                string crumb = null;
                try
                {
                    //initialize on first time use
                    if (regex_crumb == null)
                        regex_crumb = new Regex("CrumbStore\":{\"crumb\":\"(?<crumb>.+?)\"}",
                            RegexOptions.CultureInvariant | RegexOptions.Compiled, TimeSpan.FromSeconds(5));
                    MatchCollection matches = regex_crumb.Matches(html);
                    if (matches.Count > 0)
                        crumb = matches[0].Groups["crumb"].Value;
                    else
                        Debug.Print("Regex no match");
                    //prevent regex memory leak
                    matches = null;
                }
                catch (Exception ex)
                {
                    Debug.Print(ex.Message);
                }
                return crumb;
            }
        }
    }

    Updated 1 Jun 17
    credits to @Ed0906
    modify crumb regex pattern to Regex("CrumbStore\":{\"crumb\":\"(?<crumb>.+?)\"}"

  27. Answer 15
  28. There is a fix that I have found to work well. Please see my post:

    Yahoo Finance API / URL not working: Python fix for Pandas DataReader, where I followed the steps to run $ pip install fix_yahoo_finance --upgrade --no-cache-dir (and also upgraded pandas_datareader to be sure) and tested OK:

    from pandas_datareader import data as pdr
    import fix_yahoo_finance
    data = pdr.get_data_yahoo('BHP.AX', start='2017-04-23', end='2017-05-24')

    Also note that the last two data columns are 'Adj Close' and 'Volume', so for my purpose I have reset the columns to the original order:

    cols = ['Date', 'Open', 'High', 'Low', 'Close', 'Volume', 'Adj Close']
    data = data.reindex(columns=cols)
  29. Answer 16
  30. I was in the same boat. I managed to get the CSV downloaded from Yahoo with some frankencode I made from bits and pieces off Google, SO and some head-scratching.

    However, I discovered Intrinio (look it up), signed up, and my free account gets me 500 historical data API calls a day, with much more data that is much more accurate than Yahoo's. I rewrote my code for the Intrinio API, and I'm happy as a clam.

    BTW, I don't work or have anything to do with Intrinio, but they saved my butt big time...

  31. Answer 17
  32. Yahoo has gone to a ReactJS front end, which means that if you analyze the request headers from the client to the backend, you can get the actual JSON they use to populate the client-side stores.

    These calls seem to be load balanced between query1.finance.yahoo.com & query2.finance.yahoo.com

    With the following path at either one of the hostnames listed above, you can query financial statements, insider activity, SEC filings, analyst estimates... (the module names are pretty self-explanatory and I'll list them below)

    Fundamental data path (substitute your ticker symbol for AAPL):

    https://query2.finance.yahoo.com/v10/finance/quoteSummary/AAPL?modules=
    Options for the ?modules= query are as follows:

    modules = [ 'assetProfile', 'incomeStatementHistory', 'incomeStatementHistoryQuarterly', 'balanceSheetHistory', 'balanceSheetHistoryQuarterly', 'cashFlowStatementHistory', 'cashFlowStatementHistoryQuarterly', 'defaultKeyStatistics', 'financialData', 'calendarEvents', 'secFilings', 'recommendationTrend', 'upgradeDowngradeHistory', 'institutionOwnership', 'fundOwnership', 'majorDirectHolders', 'majorHoldersBreakdown', 'insiderTransactions', 'insiderHolders', 'netSharePurchaseActivity', 'earnings', 'earningsHistory', 'earningsTrend', 'industryTrend', 'indexTrend', 'sectorTrend' ]

    A full url querying for the ticker AAPL and the modules assetProfile, defaultKeyStatistics, and earningsHistory would look like this:
    https://query2.finance.yahoo.com/v10/finance/quoteSummary/AAPL?modules=assetProfile%2CdefaultKeyStatistics%2CearningsHistory
    The %2C is just the hex (URL-encoded) representation of a comma, and one needs to be inserted between each module you request. If you care to know more about the encoding details, look at this stackoverflow post.
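    The %2C joining can be sketched in Python (the helper name is mine; urllib's quote performs the comma encoding):

```python
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote        # Python 2

def build_quote_summary_url(ticker, modules):
    """Join module names with URL-encoded commas for the quoteSummary path."""
    return ('https://query2.finance.yahoo.com/v10/finance/quoteSummary/{0}'
            '?modules={1}'.format(ticker, quote(','.join(modules))))
```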

    You can query for any combination of the modules I've listed that you like. But don't be a d*!^. A query of all the fields is just under 300kb so if you downloaded all of the NYSE & NASDAQ it would be around 2GB, certainly large enough for yahoo to notice. This is a really convenient 'api' and I'd hate to see them shut it down. Please take the time to figure out what data you really need and only query for that dataset. ~~take what you need and leave the rest..~~(The Band)

    Options contract path:

    https://query2.finance.yahoo.com/v7/finance/options/AAPL
    That call will give you the JSON for the current expiration month. To retrieve future expirations you need to add a date query:

    ?date= The date value will be an integer that represents the contract expiration date as a UNIX timestamp. (There's a stackoverflow post explaining methods for converting Unix timestamps to readable date formats in Python.)
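    For example, converting such an expiration timestamp to a readable date in Python (a sketch; rendering in UTC is my assumption):

```python
from datetime import datetime, timezone

def expiry_to_date(ts):
    """Render a UNIX expiration timestamp as a YYYY-MM-DD string (UTC)."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime('%Y-%m-%d')
```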

    Price request path:

    https://query1.finance.yahoo.com/v8/finance/chart/AAPL?symbol=AAPL&period1=0&period2=9999999999&interval=3mo
    That path will give you all the available price data for ticker AAPL grouped in intervals of 3 months.

    You can get interval=1m for about the past 4-5 days. You can get interval=5m for the past 80(ish) days. I assume their cut-off is internally based on trading days, so my naive approach to creating past values for period1 is a little blurry because I wasn't accounting for holidays and such (but that might be an oversimplified assumption; how far back you can go with different intervals is a little confusing and inconsistent). If you've created a request that is valid but the interval is not supported, Yahoo will return interval=3mo. So just move the period1 value forward in time until you get the interval that is supported.

    Add pre & post market prices

    &includePrePost=true
    Add dividends & splits

    &events=div%7Csplit
    An example request for all price data for ticker AAPL on a 1 day interval including pre and post market action as well as dividends and splits would be:
    https://query1.finance.yahoo.com/v8/finance/chart/AAPL?symbol=AAPL&period1=0&period2=9999999999&interval=1d&includePrePost=true&events=div%7Csplit
    The 9999999999 value for period2= is only significant in that it's a value greater than or equal to today's Unix timestamp.

    Be respectful and maybe throw Yahoo a bone by clicking on some ads once in a while. Let them know you appreciate the service. They could have easily buried all this information so it would be nearly impossible to discover, but they didn't, and that's not by accident.

    I've written a personal library that uses random user-agent rotation and routes requests through a rotation of anonymous proxies to keep the origin IP of the requests private and avoid blacklisting. It also throttles requests to a reasonable rate so as not to be unfair to the Yahoo servers. If there is interest, I'm willing to open-source it on GitHub (it's actually part of a distributed computing library I wrote that scales EC2 instances on Amazon for data processing, so I'd have to take the time to pull it apart).

  33. Answer 18
  34. Answer 19
  35. It's possible to get current and historical data from the Google Finance API. It works very well for me.

  36. Answer 20
  37. Answer 21
  38. Javascript

    Find cookie:

    match = document.cookie.match(new RegExp('B=([^;]+)'));
    alert(match[1]);

    Find crumb:

    i = document.body.innerHTML.search("CrumbStore");
    if (i >= 0) alert(document.body.innerHTML.substr(i + 22, 11));

    Find crumb for mobile:

    i = document.body.innerHTML.search('USER={\"crumb\":');
    if (i >= 0) alert(document.body.innerHTML.substr(i + 15, 11));

    and it's probably best to wait for the page (e.g. a quote page) to load up first; you can check that with document.readyState.

  39. Answer 22
  40. An alternative approach to those mentioned so far (Yahoo, Google and Intrinio) is to get the historical data from Alpha Vantage for free. Their web service delivers intraday, daily, and adjusted stock prices, plus 50+ technical indicators. They even deliver straight to Excel - also for free - through Deriscope. (I am the author of the latter.)