Asked 7 months ago · Answers: 5 · Viewed 37 times

Is there any way I can access the thumbnail picture of a Wikipedia page using an API? I mean the image in the box on the top right side. Are there any APIs for that?

 Answers

29

http://en.wikipedia.org/w/api.php

Look at prop=images.

It returns an array of the image filenames used on the parsed page. You then have the option of making another API call to find the full image URL, e.g.: action=query&titles=Image:INSERT_EXAMPLE_FILE_NAME_HERE.jpg&prop=imageinfo&iiprop=url

or to calculate the URL via the filename's hash.

Unfortunately, while the array of images returned by prop=images is in the order they appear on the page, the first one cannot be guaranteed to be the infobox image, because sometimes a page includes an image before the infobox (most of the time icons for metadata about the page, e.g. "this article is locked").

Searching the array of images for the first image that includes the page title is probably the best guess for the infobox image.
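The two-call flow and the title heuristic above can be sketched in Python. This is a sketch only: the helper names (`api_get`, `list_page_images`, `image_url`, `pick_infobox_image`) are my own, not part of the API; the endpoint and parameters are the ones named in the answer.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def api_get(**params):
    """Minimal helper: issue a GET request against the MediaWiki API."""
    params["format"] = "json"
    url = API + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def list_page_images(title):
    """First call: prop=images lists the image filenames used on the page."""
    pages = api_get(action="query", titles=title, prop="images")["query"]["pages"]
    return [img["title"] for page in pages.values() for img in page.get("images", [])]

def image_url(filename):
    """Second call: prop=imageinfo&iiprop=url resolves the full image URL."""
    pages = api_get(action="query", titles=filename,
                    prop="imageinfo", iiprop="url")["query"]["pages"]
    for page in pages.values():
        return page["imageinfo"][0]["url"]

def pick_infobox_image(filenames, title):
    """Best-guess heuristic from above: first filename containing the page title."""
    for name in filenames:
        if title.lower() in name.lower():
            return name
    return None
```

For example, `pick_infobox_image(list_page_images("Albert Einstein"), "Albert Einstein")` would skip any leading icon files and return the first filename mentioning the title.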

Wednesday, March 31, 2021
 
BartmanEH
answered 7 Months ago
74

Some pointers:

  • Gmail: Google Contacts Data API
  • Yahoo:
    • Address Book API (deprecated)
    • Contacts API
  • Hotmail: Windows Live Contacts API

Most of the scripts, etc. that I found didn't really work. There is a commercial one that uses scraping to retrieve the contacts, but I think such attempts are rather flawed. The links above should give you an example of how to access the address book on each service.

Saturday, June 12, 2021
 
JackTheKnife
answered 5 Months ago
69

I suggest that you use a Map<String, Integer> instead:

Create the map by doing

Map<String, Integer> values = new HashMap<String, Integer>();

Then change

int temp = 10;

to

values.put("temp", 10);

and access the value using

int tempVal = values.get("temp");
Saturday, June 19, 2021
 
SuperString
answered 4 Months ago
53

I had lots of problems figuring out this particular format and finding tools that could read/parse it. Eventually I ended up using IRIS, which required an extremely messy setup. I did this a year and a half ago on an Ubuntu machine with Python 2.7. I am pasting the steps I needed to install IRIS and all of its dependencies; hopefully this will be of some use to you, although many of the versions in the setup are probably outdated by now...

Good luck!

# http://scitools.org.uk/iris/docs/latest/installing.html

# necessary steps for a clean machine
#
# sudo apt-get install gcc python-dev build-essential python-setuptools libpq-dev git unzip cmake
# pip install virtualenv
# pip install virtualenvwrapper
# mkdir ~/.virtualenvs
# nano ~/.bashrc
# export WORKON_HOME=$HOME/.virtualenvs
# source /usr/local/bin/virtualenvwrapper.sh
# . ~/.bashrc
# mkvirtualenv iris

pip install numpy
pip install biggus

sudo apt-get install libblas-dev liblapack-dev libatlas-base-dev gfortran
# OR
# sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose

pip install scipy

# cartopy dependencies
pip install cython

# https://github.com/OSGeo/proj.4
wget http://download.osgeo.org/proj/proj-4.9.1.tar.gz
tar -xzf proj-4.9.1.tar.gz
cd proj-4.9.1/
./configure
make
make install

# OR
# sudo apt-get install libproj-dev
# sudo apt-get install libgeos-dev

# http://sourceforge.net/projects/pyke/files/pyke/
wget http://sourceforge.net/projects/pyke/files/pyke/1.1.1/pyke-1.1.1.zip/download
unzip download
cd pyke-1.1.1/
python setup.py build
python setup.py install

pip install cartopy

# netcdf4 dependencies
# https://code.google.com/p/netcdf4-python/wiki/UbuntuInstall
# wget https://www.hdfgroup.org/ftp/HDF5/current/src/hdf5-1.8.16.tar (??????????????)
sudo apt-get install libhdf5-dev

pip install h5py

sudo apt-get install python-netcdf libnetcdf-dev libnetcdf4

git clone https://github.com/Unidata/netcdf4-python.git
cd netcdf4-python
python setup.py build
python setup.py install


# other iris dependencies
pip install cf_units

sudo apt-get install libudunits2-dev

nano ~/.bashrc
# check where exactly xml file is
export UDUNITS2_XML_PATH=/usr/local/share/doc/udunits/udunits2.xml
. ~/.bashrc

pip install pillow

# gribapi
# https://software.ecmwf.int/wiki/display/GRIB/GRIB+API+CMake+installation
wget -O grib_api-1.14.4-Source.tar.gz "https://software.ecmwf.int/wiki/download/attachments/3473437/grib_api-1.14.4-Source.tar.gz?api=v2"
tar -xzf grib_api-1.14.4-Source.tar.gz
mkdir build; cd build
cmake ../grib_api-1.14.4-Source -DENABLE_PYTHON=ON
make -j4
ctest -j4
make install

# OR
# sudo apt-get install libgrib-api-dev
# sudo apt-get install openjpeg-tools

cp -R /usr/local/lib/python2.7/site-packages/grib_api ~/.virtualenvs/iris/lib/python2.7/site-packages/

# aaaand, here we go, iris!
git clone https://github.com/SciTools/iris.git
cd iris
python setup.py build
python setup.py install
# rejoice!

# pip freeze output:
# Biggus==0.12.0
# Cartopy==0.13.0
# cf-units==1.0.0
# Cython==0.23.4
# h5py==2.5.0
# Iris==1.10.0.dev0
# netCDF4==1.2.2
# numpy==1.10.2
# Pillow==3.0.0
# pyke==1.1.1
# pyshp==1.2.3
# scipy==0.16.1
# Shapely==1.5.13
# six==1.10.0
Sunday, August 15, 2021
 
aslum
answered 2 Months ago
73

There is a bug in the current version of the wikipedia Python library. You can install a branch by lucasdnd on GitHub that fixes this:

pip install git+https://github.com/lucasdnd/Wikipedia.git

(Add --upgrade if you already have it installed.)

Now:

>>> import wikipedia
>>> ny = wikipedia.page("New York")
>>> ny.sections
[u'History', u'16th century', u'17th century', u'18th century, the American Revolution, and statehood', u'19th century', u'Immigration', u'September 11, 2001 attacks', u'Hurricane Sandy, 2012', u'Geography', u'Climate', u'Statescape', u'Regions', u'Adjacent geographic entities', u'State parks', u'National parks', u'Administrative divisions', u'Demographics', u'Population', u'Most populous counties', u'Major cities', u'Metropolitan areas', u'Racial and ancestral makeup', u'Languages', u'Religion', u'LGBT', u'Economy', u'Wall Street', u'Silicon Alley', u'Microelectronic hardware and photographic processing', u'Media and entertainment', u'Tourism', u'Exports', u'Education', u'Transportation', u'Government and politics', u'Government', u'Capital punishment', u'Federal representation', u'Politics', u'Sports', u'See also', u'References', u'Further reading', u'External links'] 

It'll hopefully be fixed in the main library sometime soon.
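Until the fix lands, you can also fetch the sections straight from the MediaWiki API with action=parse&prop=sections, bypassing the library entirely. This is my own sketch, not part of the original answer; the helper names (`parse_sections`, `section_lines`) are hypothetical:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def section_lines(data):
    """Pull the heading text out of an action=parse&prop=sections response."""
    return [s["line"] for s in data["parse"]["sections"]]

def parse_sections(title):
    """Fetch section headings for a page directly from the MediaWiki API."""
    params = urllib.parse.urlencode({
        "action": "parse",
        "page": title,
        "prop": "sections",
        "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        return section_lines(json.load(resp))
```

`parse_sections("New York")` should return the same heading list as `ny.sections` above, without depending on the patched library.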

Thursday, October 7, 2021
 
muaaz
answered 2 Weeks ago