General Idea:
Have a community site with two areas: a front end (Ideas) and a back end. A user (COM) will log in and get posts from different users, pulled from Twitter, FB, blogs, news, YouTube, and other types of pages and social networks. An admin will be adding posts into the database (via user posts, algorithm, API, and crawlers); every post will be attached to a category and an admin user, and sorted by date, hour, user who added the post, URL, friends, followers, followings, listed, etc. (a sketch of the post record follows).
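A minimal sketch of what such a post record could look like (Python, just to make this concrete; every field name here is an assumption, not a fixed schema):

from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical post record; field names are assumptions, not the final schema.
@dataclass
class Post:
    url: str
    source: str                  # "twitter", "fb", "blog", "news", "youtube", ...
    category: str                # category the admin attaches the post to
    added_by: str                # admin user who added this post
    added_at: datetime           # date and hour it was stored
    tags: list[str] = field(default_factory=list)
    friends: int = 0
    followers: int = 0
    followings: int = 0
    listed: int = 0
    sentiment: str = "neutral"   # "positive" | "neutral" | "negative"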
The idea is COM will log in and see a general cloud. He can tell the admin to start generating new tags, e.g. "president"; then every time he logs in, all the results from API + DB + Ideas will show on the main page. Every time COM searches for something, the admin will tell the crawler to extract the searched tags and get the posts into my DB, so the next search will show results including this saved data (a sketch of this loop follows).
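A rough sketch of that search-and-save loop, assuming a simple in-memory queue the admin pushes crawl jobs onto; all names here (on_user_search, fetch_from_apis, the db dict) are hypothetical stand-ins:

saved_tags: set[str] = set()
crawl_queue: list[str] = []       # tags the admin has told the crawler to extract
db: dict[str, list[str]] = {}     # tag -> stored post URLs (stand-in for the real DB)

def on_user_search(tag: str) -> list[str]:
    # First time COM searches a tag, queue a crawl so the DB fills up for next time.
    if tag not in saved_tags:
        saved_tags.add(tag)
        crawl_queue.append(tag)
    live = fetch_from_apis(tag)   # live API results
    return live + db.get(tag, []) # later searches include the saved data

def fetch_from_apis(tag: str) -> list[str]:
    return []                     # placeholder: would call Twitter/FB/news APIs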
There will be clouds, graphics, search, and date filters.
Posts will also show visits, followers, followings, listed, and sentiment (positive, neutral, negative), plus actions like bookmark (save to fav), save to topic, assign to user, keep track, share this, and ATTACH TO MAP. And also all connectivity with social networks.
ATTACH TO MAP: all posts and users will be saved to a geo place on a Google Map; the Google Map will be a well-done map with some categories (hotel, store, building, etc.).
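A small sketch of how ATTACH TO MAP could store a post against a map place; the category list and field names are assumptions:

from dataclasses import dataclass

PLACE_CATEGORIES = {"hotel", "store", "building"}    # sample categories, extendable

# Hypothetical geo pin linking a stored post to a place on the Google Map.
@dataclass
class MapPin:
    post_url: str
    lat: float
    lng: float
    place_category: str

def attach_to_map(post_url: str, lat: float, lng: float, category: str) -> MapPin:
    if category not in PLACE_CATEGORIES:
        raise ValueError(f"unknown place category: {category}")
    return MapPin(post_url, lat, lng, category)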
ONE CAT WILL BE IDEAS: there will be an Ideas category with an independent front-end section for user posts. It will look like any social network, with top ranked, popular, and other views; the idea is that after a user logs in, he will be able to POST / CONTRIBUTE / COMMENT / VOTE / INTERVIEW.
INTERVIEW: an investor is another type of user, one who will be able to interview a user so he can get more info about an idea. This will be in GMAP IDEAS, independently.
The Ideas admin will have a workflow in the back end, so he can recommend ideas and start recommending users to take further steps, like filling in a business plan, a cash flow, and other template documents.
Details:
1. Log in with user and password.
2. Add a tag search
3. The crawler starts looking at API results and saving them in the DB.
4. The admin will start saving data, extracting the user's posts, his friends' posts, or his friends' friends' posts, in levels: level 1 = his posts, level 2 = his friends' posts, level 3 = posts of friends of level-2 users; "friends" covers followers, following, listed, referred pages, links, etc., always with the option to select one or more (see the level crawl sketch after this list).
5. The admin will manually add posts into categories, and will also select those he wants to cron (selecting the interval: every minute, hour, day, week, or month).
6. The admin will set the crawler and schedule a cron job that starts extracting data from a user or URL, in levels: if level 2 is selected, it will extract data from him and also from his friends (or inner links).
7. All saved data will be referenced in the DB: tags, source type (FB, Twitter, blog, page, news, etc.), friends, followers, followings, links, sentiment, topic, category. There will also be a workflow of actions and statuses, like send to admin user, save in fav, save in category, reply, alarm, cloud.
8. Cloud: there will be a general cloud, then a cloud for each search; after clicking one tag of the cloud, only the results within that tag remain (smaller result sets every time you click a tag; see the drill-down sketch after this list).
9. Sentiment: sentiment will be assigned by users (in Ideas), the admin can also set it manually, and there will always be an automatic sentiment, depending on MY TEXT MINING SOFTWARE (see the precedence sketch after this list).
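Level crawl sketch for items 4 and 6: a breadth-first walk that stops at the selected level; the connection types and function names are assumptions, and the fetch functions are placeholders for the real API/crawler calls:

# Hypothetical depth-limited crawl: level 1 = the user's own posts,
# level 2 = his friends' posts, level 3 = posts of friends of level-2 users.
def crawl_in_levels(root_user: str, max_level: int,
                    connection_types: list[str]) -> list[tuple[str, int]]:
    collected: list[tuple[str, int]] = []    # (post, level) pairs to save in the DB
    frontier, seen = [root_user], {root_user}
    for level in range(1, max_level + 1):
        next_frontier = []
        for user in frontier:
            collected += [(post, level) for post in fetch_posts(user)]
            for kind in connection_types:    # "followers", "following", "listed", ...
                for friend in fetch_connections(user, kind):
                    if friend not in seen:
                        seen.add(friend)
                        next_frontier.append(friend)
        frontier = next_frontier
    return collected

def fetch_posts(user: str) -> list[str]:
    return []      # placeholder for the real API/crawler call

def fetch_connections(user: str, kind: str) -> list[str]:
    return []      # placeholder: followers, following, listed, linked pages...

crawl_in_levels("someuser", 2, ["followers", "following"]) would save the user's posts plus his friends' posts; the cron from item 5 (every minute, hour, day, week, or month) would simply re-run this call on schedule.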
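Drill-down cloud sketch for item 8: each clicked tag filters the working set and the next, smaller cloud is rebuilt from what remains. Counting tag frequency with a plain Counter is an assumption about how cloud sizes are computed:

from collections import Counter

# Hypothetical drill-down: posts are (url, tags) pairs; each click narrows the set.
def click_tag(posts: list[tuple[str, set[str]]], tag: str):
    remaining = [p for p in posts if tag in p[1]]    # keep only matching posts
    cloud = Counter(t for _, tags in remaining
                    for t in tags if t != tag)       # rebuild a smaller cloud
    return remaining, cloud

Starting from the general cloud, every click_tag call returns a smaller result set together with its own cloud, so results shrink with each click.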
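Sentiment sketch for item 9; the precedence (admin override first, then the user's vote from Ideas, automatic as fallback) is an assumption, and auto_sentiment is a stand-in for MY TEXT MINING SOFTWARE:

from typing import Optional

# Hypothetical precedence: admin's manual choice wins, then the user's vote,
# then the automatic score from the text mining software.
def resolve_sentiment(admin_choice: Optional[str],
                      user_choice: Optional[str],
                      text: str) -> str:
    if admin_choice in ("positive", "neutral", "negative"):
        return admin_choice
    if user_choice in ("positive", "neutral", "negative"):
        return user_choice
    return auto_sentiment(text)

def auto_sentiment(text: str) -> str:
    return "neutral"   # placeholder for MY TEXT MINING SOFTWARE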