CAPTAIN: Comprehensive Composition Assistance for Photo Taking

Abstract: Many people are interested in taking astonishing photos and sharing them with others. Emerging hardware and software have made digital photography ubiquitous and increasingly capable. Because composition matters in photography, researchers have leveraged common composition techniques, such as the rule of thirds, the triangle technique, and perspective-related techniques, to assess the aesthetic quality of photos computationally. However, the composition techniques developed by professionals are far more diverse than the well-documented ones can cover. We leverage this vast, under-explored body of photographic innovation for computational composition assistance; to date, there has been no holistic framework that captures the important aspects of a given scene and offers constructive cues to help individuals take a better shot in their photography adventure. We propose a comprehensive framework, named CAPTAIN (Composition Assistance for Photo Taking), containing integrated deep-learned semantic detectors, sub-genre categorization, artistic pose clustering, personalized aesthetics-based image retrieval, and style set matching. The framework is backed by a large dataset crawled from a photo-sharing website whose users are mostly photography enthusiasts and professionals.
The work proposes a sequence of steps that has not been explored by researchers in the past.
The work addresses personal preferences for composition by presenting a ranked list of photographs to the user based on user-specified weights in the similarity measure; we believe this design benefits from leveraging user-defined preferences. Our framework extracts the ingredients of a given snapshot of a scene (i.e., the scene the user intends to photograph) as a set of composition-related features, ranging from low-level features such as color, pattern, and texture to high-level features such as pose, category, rating, gender, and object. Our composition model, indexed offline, is used to provide visual exemplars as recommendations for the scene, which is a novel model for aesthetics-related image retrieval. The matching algorithm recognizes the best shot among a sequence of shots with respect to the user's preferred style set. We have conducted a number of experiments on the newly proposed components and report our findings. A user study demonstrates that the work is useful to those taking photos.
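
To make the personalized retrieval step concrete, here is a minimal Python sketch of ranking indexed exemplars by a user-weighted sum of per-feature distances. The feature names, weights, and Euclidean distance below are illustrative assumptions, not the actual feature set or similarity measure used in CAPTAIN.

import numpy as np

def rank_exemplars(query_feats, exemplars, weights):
    """Rank indexed exemplars by a user-weighted sum of per-feature distances.

    query_feats and each entry of exemplars map a feature name (e.g. 'color',
    'pose') to a vector; weights holds the user-specified preference weights.
    All names and the Euclidean distance are placeholders for illustration.
    """
    scores = []
    for idx, ex_feats in enumerate(exemplars):
        dist = sum(
            weights.get(name, 0.0) * np.linalg.norm(query_feats[name] - ex_feats[name])
            for name in query_feats
        )
        scores.append((dist, idx))
    # Smaller weighted distance means a better match; return indices in order.
    return [idx for _, idx in sorted(scores)]

# Example: this user cares more about pose similarity than color.
query = {'color': np.random.rand(8), 'pose': np.random.rand(16)}
dataset = [{'color': np.random.rand(8), 'pose': np.random.rand(16)} for _ in range(5)]
print(rank_exemplars(query, dataset, weights={'color': 0.3, 'pose': 0.7}))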

Keywords: Computational Composition, Image Aesthetics, Photography, Deep Learning, Image Retrieval

captain_springer_ijcv

Auction-based Resource Management in Computer Architecture

ABSTRACT

Resource management systems rely on a centralized approach to manage the applications running on each resource. A centralized resource management system is neither efficient nor scalable for large-scale servers, as the number of applications running on shared resources is increasing dramatically and the centralized manager may not have enough information about the applications' needs.

This work proposes a decentralized auction-based resource management approach to reach an optimal strategy in a resource competition game. The applications learn through repeated interactions to select their optimal actions for the shared resources. Specifically, we investigate two case studies: a cache competition game and a main processor and coprocessor congestion game. We enforce costs for each resource and derive a bidding strategy. Evaluation of the proposed approach shows that our distributed allocation is scalable and outperforms the static and traditional approaches.
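
As a loose illustration of the repeated-auction idea (not the bidding strategy derived in the paper), the Python sketch below runs a toy proportional-share auction for cache ways over repeated rounds, with each application nudging its bid according to its private value and its last allocation. The application names, values, and update rule are hypothetical.

def repeated_cache_auction(values, cache_ways=8, rounds=50, step=0.05):
    """Toy repeated auction for cache ways (illustrative only).

    'values' maps an application name to its private value per cache way.
    Each round, ways are allocated proportionally to the bids; an application
    bids up while a way is still worth more than it is paying, and backs off
    otherwise. The paper derives the actual bidding strategy analytically.
    """
    bids = {app: 0.1 for app in values}
    allocation = {}
    for _ in range(rounds):
        total = sum(bids.values())
        allocation = {app: cache_ways * bid / total for app, bid in bids.items()}
        for app, value in values.items():
            if value > bids[app]:
                bids[app] = min(value, bids[app] + step)   # still profitable: bid up
            else:
                bids[app] = max(0.01, bids[app] - step)    # overpaying: back off
    return {app: round(ways, 2) for app, ways in allocation.items()}

print(repeated_cache_auction({'appA': 1.0, 'appB': 0.6, 'appC': 0.3}))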

Full article > tpds-carma

Semantic Scholar Information is not Stable!

The website spreads unstable information about many scholars, their papers, and their citations. Just try my name:
www.semanticscholar.org/search?q=farshid%20farhat&sort=relevance
www.semanticscholar.org/author/Farshid-Farhat/2052042

The information should have been supervised and verified before being made public on the web; otherwise, shut it down! I'm totally disappointed with Semantic Scholar!

CAGE: Contention-Aware Game-theoretic modEl

Abstract

Traditional resource management systems rely on a centralized approach to manage the users running on each resource. A centralized resource management system is not scalable for large-scale servers, as the number of users running on shared resources is increasing dramatically and the centralized manager may not have enough information about the applications' needs. In this paper, we propose a distributed game-theoretic resource management approach that uses a market auction mechanism to find the optimal strategy in a resource competition game. The applications learn through repeated interactions to choose their actions over the shared resources. Specifically, we look into two case studies: a cache competition game and a main processor and coprocessor congestion game. We enforce costs for each resource and derive a bidding strategy. Evaluation of the proposed approach shows that our distributed allocation is scalable and outperforms the static and traditional approaches.
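
The congestion-game case study can be pictured with a toy best-response loop: each application repeatedly picks the main processor or the coprocessor and pays a cost that grows with the load on the chosen resource. The cost functions and update rule below are illustrative placeholders, not the model or equilibrium analysis in the paper.

def congestion_best_response(n_apps=6, rounds=30):
    """Toy best-response dynamics for a two-resource congestion game.

    Each application repeatedly switches to whichever resource is cheaper
    given the other applications' current choices. The cost functions are
    made up for illustration.
    """
    cost = {
        'cpu': lambda load: 1.0 * load,           # main processor: contention grows linearly
        'coproc': lambda load: 0.5 + 1.5 * load,  # coprocessor: offload overhead plus contention
    }
    choice = ['cpu'] * n_apps  # everyone starts on the main processor
    for _ in range(rounds):
        for i in range(n_apps):
            others = choice[:i] + choice[i + 1:]
            # Pick the resource with the lowest cost, counting ourselves in its load.
            choice[i] = min(cost, key=lambda r: cost[r](others.count(r) + 1))
    return {r: choice.count(r) for r in cost}

print(congestion_best_response())  # approximate equilibrium load split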

Draft > CAGE

 

Split AT&T Bill in a Shared Data Plan

Getting fair shares for a shared data plan is always a disaster! It is not a simple plus/minus: a lot of overheads, such as taxes and extras, are bundled into the charges, and AT&T doesn't divide the data charge and overage among the lines but bills them to only one line!

The attached Excel file contains the details of computing the separate shares of a 13-line AT&T shared data plan. To start, fill out the yellow boxes at the bottom of the file; you can get the numbers from the “View my bill” page on the AT&T website. The instructions are as follows:

  1. Fill out the total bill balance (Total), the number of lines (T#), the total amount of data (TData), and the AT&T base charge per line (ATT Base) by looking at the “View my bill” page on the AT&T website.
  2. Fill out the insurances (Insurance column) and the installments (Installment column), again by looking at the “View my bill” page on the AT&T website.
  3. If any line has an extra charge (for international calls, etc.), calculate the extra charge by subtracting ATT Base from the actual balance shown for that line. [Note: AT&T bills one line for the whole data charge, so don’t treat that as an extra charge.]
  4. If the plan had a data overage, fill out each line’s data usage in the Data column (shown in yellow) from the “Check Usage” page on the AT&T website. A user-defined function (UDF), UDF(x) = ROUNDUP(x*(1+SIGN(x))/2, 0), where x is the deviation of the line’s data usage from the mean usage, is used to compute each line’s share of the data overage; it keeps only the positive deviations (rounded up), so lines at or below the average pay nothing. A small sketch of this split is given after this list.
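
For readers who prefer code to the spreadsheet formula, here is a minimal Python sketch of the same overage split. The line names and amounts are made up; the weighting mirrors ROUNDUP(x*(1+SIGN(x))/2, 0), so only usage above the mean counts.

import math

def overage_shares(usage_gb, overage_charge):
    """Split a data overage charge among lines based on usage above the mean.

    Mirrors the spreadsheet UDF: a line's weight is its usage minus the mean
    usage, rounded up and floored at zero, so lines at or below the average
    pay nothing. All numbers below are examples, not real bill data.
    """
    mean = sum(usage_gb.values()) / len(usage_gb)
    weights = {line: math.ceil(max(u - mean, 0)) for line, u in usage_gb.items()}
    total = sum(weights.values())
    if total == 0:
        return {line: 0.0 for line in usage_gb}
    return {line: overage_charge * w / total for line, w in weights.items()}

# Example: three lines and a $30 overage; only the heavy user pays here.
print(overage_shares({'line1': 1.2, 'line2': 3.8, 'line3': 7.0}, overage_charge=30.0))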

Now, you can verify the results:

  1. The green balances shown in the ATT Website column should be equal to the balances on the “View my bill” page of the AT&T website.
  2. You can also verify the overpay balance, the calculated balance, etc.

Finally, the fair shares appear in the “Separate” column.

ATT_2017_08

Maryam Mirzakhani, the mother who won the Fields Medal

Unbelievable and heartbreaking! Our role model since elementary school has died at 40! I still remember the days I struggled to learn from her book, written for young math lovers preparing for the math Olympiad. She was ahead of us as a senior, but she climbed all the steps very fast, and soon she became a Stanford professor while raising a little girl.

Her milestone came when she won the Fields Medal in mathematics (the most prestigious award in the field, equivalent to a Nobel Prize). Her accomplishment is inspiring not only for all Iranians but especially for all women in the fundamental sciences. Her work certainly helps other mathematicians in the field, but her humble character was something her work alone could not convey.

Unfortunately, her life was short, but it was fruitful for all of us, not only in its scientific aspects but also in its societal aspects, as a public figure like Galois, Ramanujan, and Riemann. May she rest in peace.

 

Deep-learned Models and Photography Idea Retrieval

Intelligent Portrait Composition Assistance (IPCA)
Farshid Farhat, Mohammad Kamani, Sahil Mishra, James Wang
ACM Multimedia 2017, Mountain View, CA, USA
(Acceptance rate = 64/495 = 12.93%)

ABSTRACT: Retrieving photography ideas corresponding to a given location facilitates the use of smart cameras, as there is high interest among amateurs and enthusiasts in taking astonishing photos at any time and in any location. Existing research captures some aesthetic techniques, such as the rule of thirds, triangle, and perspective, and retrieves useful feedback based on one technique. However, these approaches are restricted to a particular technique, and the retrieved results have room to improve, as they can be limited by the quality of the query. There is a lack of a holistic framework that captures the important aspects of a given scene and gives a novice photographer informative feedback to take a better shot in his or her photography adventure. This work proposes an intelligent framework for portrait composition using our deep-learned models and image retrieval methods. A highly rated, web-crawled portrait dataset is exploited for retrieval purposes. Our framework detects and extracts the ingredients of a given scene, representing them as a correlated hierarchical model. It then matches the extracted semantics against the dataset of aesthetically composed photos to produce a ranked list of photography ideas, and gradually optimizes the human pose and other artistic aspects of the scene to be captured. The conducted user study demonstrates that our approach is more helpful than other feedback retrieval systems.
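
As a rough sketch of the matching step (the real framework uses a correlated hierarchical model rather than plain set overlap), the Python snippet below ranks dataset portraits by how well their semantic tags overlap with the tags extracted from the query scene. The tags and photo ids are illustrative assumptions.

def rank_by_semantic_overlap(query_tags, dataset):
    """Rank dataset photos by Jaccard overlap between detected semantic tags.

    'dataset' maps a photo id to its set of tags; the tags, ids, and the
    Jaccard measure are placeholders for the paper's actual matching model.
    """
    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 0.0

    return sorted(dataset, key=lambda pid: jaccard(query_tags, dataset[pid]), reverse=True)

query = {'female', 'outdoor', 'standing', 'full-body'}
photos = {
    'p1': {'female', 'outdoor', 'sitting'},
    'p2': {'male', 'indoor', 'standing'},
    'p3': {'female', 'outdoor', 'standing', 'half-body'},
}
print(rank_by_semantic_overlap(query, photos))  # photography ideas, best match first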

Please cite our paper if you are using our professional portrait dataset.

@inproceedings{Farhat:2017:IPC:3126686.3126710,
author = {Farhat, Farshid and Kamani, Mohammad Mahdi and Mishra, Sahil and Wang, James Z.},
title = {Intelligent Portrait Composition Assistance: Integrating Deep-learned Models and Photography Idea Retrieval},
booktitle = {Proceedings of the on Thematic Workshops of ACM Multimedia 2017},
series = {Thematic Workshops ’17},
year = {2017},
isbn = {978-1-4503-5416-5},
location = {Mountain View, California, USA},
pages = {17--25},
numpages = {9},
url = {http://doi.acm.org/10.1145/3126686.3126710},
doi = {10.1145/3126686.3126710},
acmid = {3126710},
publisher = {ACM},
address = {New York, NY, USA},
keywords = {deep learning, image aesthetics, image retrieval, photographic composition, portrait photography, smart camera},
}

 

IAAP Scholarship Award

Dear Mr. Farshid Farhat,

On behalf of Dr. Manouchehr Farkhondeh, President of the Iranian American Academics and Professionals (IAAP) and the entire organization, I would like to congratulate you on your outstanding academic success as well as the invaluable service to the Iranian-American community. We are also delighted to inform you that you have been selected as one of the recipients of IAAP’s Scholarship Awards for 2017.

We look forward to the opportunity to congratulate you in person and present you with the Award Certificate and the accompanying check at our special gathering on June 24, 2017 (information about the ceremony is posted at http://iaadc.net/).

We ask that you kindly acknowledge receipt of this email at your earliest convenience and be prepared to introduce yourself and provide a short background on the program you are in and any studies or project(s) you might be working on for a total of 3 to 4 minutes.

You will be among our honored guests and, as such, exempt from paying the customary fee, but to order enough food for everyone, we request that you register for the event at your earliest convenience, if you have not done so already. Your guests, if any, are also welcome and expected to RSVP early but are required to pay the nominal dinner fees.

In case you are truly unable to participate in the ceremony, your certificate and the check can be given to a designated person, provided we receive your notification by Tuesday, June 20th at the latest, or they can be mailed to you at a specified address. Also, if you are unable to be with us in person, please make sure to send us, along with your notification, a short video covering the information requested in the third paragraph by Sunday, June 18th.

Sincerely,
Yasaman Ardeshirpour, PhD
Scholarship Committee Chair

frash 🙂