
I have written a little script using Python 3 that gets an RSS news feed using the feedparser library.

I then loop through the entries (a list of dictionaries) and use a try/except block to insert the data into a MySQL db using pymysql (originally I tried to use MySQLdb, but read here and in other places that it does not work with Python 3).

I originally followed the PyMySQL example on GitHub, but that did not work for me, so I used the different pymysql syntax shown here on DigitalOcean. Their example worked when I tested it on its own.

But when I tried to incorporate it into my query, something went wrong: the code in the try block never completed and the exception handler ran every time.

Here is my code:

#! /usr/bin/python3

# web_scraper.py 1st part of the project, to get the data from the 
# websites and store it in a mysql database

import cgitb
cgitb.enable()

import requests,feedparser,pprint,pymysql,datetime
from bs4 import BeautifulSoup
conn = pymysql.connect(host="localhost",user="root",password="pass",db="stories",charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published,"%Y-%m-%d %H:%M:%S")
        try:
            sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s,%s)"
            c.execute(sql,(article.title, article.summary,article.link,dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
            conn.commit()
        except Exception:
            print("Not working")


adbNews()

I am not entirely sure what I am doing wrong. I have converted the date string to the format MySQL's DATETIME type expects (originally I did not do this), but each time I run the program nothing gets stored in the db and the exception message gets printed.
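Printing the traceback instead of a generic message would have surfaced the real problem. A minimal sketch of the failure mode, using plain %-formatting to stand in for pymysql's internal query formatting (the values here are made up): five %s placeholders receive only four values.

```python
import traceback

# Five placeholders, but only four supplied values -- the same mismatch
# as in the question's original query.
sql = ("INSERT INTO articles(article_title,article_desc,article_link,article_date) "
       "VALUES (%s,%s,%s,%s,%s)")
params = ("title", "summary", "http://example.com", "2017-01-17 12:10:00")

try:
    sql % params  # stands in for c.execute(sql, params)
except Exception:
    traceback.print_exc()  # shows the actual error instead of "Not working"
```

Running this prints a traceback ending in "TypeError: not enough arguments for format string", which points straight at the extra placeholder.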

EDIT:

After reading Daniel Roseman's comments I removed the try/except block and read the errors that Python gave me. The problem was an extra %s placeholder in my SQL query.

Here is the edited working code:

#! /usr/bin/python3

# web_scraper.py 1st part of the project, to get the data from the 
# websites and store it in a mysql database

import cgitb
cgitb.enable()

import requests,feedparser,pprint,pymysql,datetime
from bs4 import BeautifulSoup
conn = pymysql.connect(host="localhost",user="root",password="pass",db="stories",charset="utf8mb4")
c = conn.cursor()

def adbNews():
    url = 'http://feeds.feedburner.com/adb_news'
    d = feedparser.parse(url)
    articles = d['entries']
    for article in articles:
        dt_obj = datetime.datetime.strptime(article.published,"%Y-%m-%d %H:%M:%S")
        # extra %s placeholder removed now
        sql = "INSERT INTO articles(article_title,article_desc,article_link,article_date) VALUES (%s,%s,%s,%s)"
        c.execute(sql,(article.title, article.summary,article.link,dt_obj.strftime('%Y-%m-%d %H:%M:%S'),))
        conn.commit()

adbNews()
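As a side note, hard-coding a strptime format for article.published is fragile, since many feeds use RFC 822 dates. feedparser also exposes published_parsed, a UTC time.struct_time it derives from whatever date format the feed uses, which can be converted without knowing that format. A sketch with a made-up stand-in entry:

```python
import calendar, datetime, time

def entry_datetime(entry):
    # feedparser normalizes feed dates into published_parsed, a UTC
    # time.struct_time, regardless of the feed's original date format.
    ts = calendar.timegm(entry["published_parsed"])
    return datetime.datetime.utcfromtimestamp(ts).strftime("%Y-%m-%d %H:%M:%S")

# A stand-in for a feedparser entry (real entries also allow attribute access):
fake_entry = {"published_parsed": time.struct_time((2017, 1, 17, 12, 10, 0, 1, 17, 0))}
print(entry_datetime(fake_entry))  # 2017-01-17 12:10:00
```

The returned string can be passed to the INSERT directly in place of dt_obj.strftime(...).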
Comments:

  • Never ever use phrases like "did not work", "there is an error", etc. State the exact problem. Commented Jan 17, 2017 at 12:10
  • In your case you can't find out the exact error because of extremely poor exception handling. Don't catch broad exceptions like that; if you do, at least import traceback; traceback.print_exc() Commented Jan 17, 2017 at 12:11
  • @e4c5 Edited the phrase, hope it's better now? Will try something out with the exception handling now. Commented Jan 17, 2017 at 12:15
  • Or just remove the try/except completely, and let Python tell you what's wrong. Commented Jan 17, 2017 at 12:17
