Saturday 2 November 2013

HOW effective ARE YOU......

There is no percentage score here to measure your effectiveness, but these ideas will help you become more effective and make the best of what you have.

This is a simple, psychology-based way to look at how efficient you are in your daily life.

Many of us just live our lives as usual, but somehow we always try to make each day better and more effective than the previous one. We are usually busy with thousands of thoughts in our heads, yet few of us ever stop to think about how the brain actually works.

The best part of our brain is that we can train it however we want. But there is also a downside to this: if the training is done in the wrong way, it will lead to sorrow, unhappiness, defeat at work and much more. So the right training is important to be successful and happy.

TIPS TO TRAIN YOUR BRAIN 

  1. Learn from everything good around you, keep smiling in difficult situations, and never forget that it will take time to become what you want to be, just as it took years for you to grow.
  2. Focus your brain on one task for a set time; this will increase your accuracy and speed.
  3. Think about your next step before you take it; plan for perfection.
  4. Plan your day so that you don't miss anything.
  5. Every day should include good food, sufficient sleep and healthy exercise. Exercise should be done for both brain and body.
  6. Brain exercise includes puzzle games, study, research and so on.
  7. For body exercise, try to target different parts of the body every day. At least walk a few kilometres if you don't love exercising.
  8. Sleep sufficiently, as sleep repairs both brain and body and makes you stronger.
  9. Make habits of the steps mentioned above, and don't stop here: the mind is an endless power. Utilize it, make more steps of your own, and always remember, "What you sow is what you reap". So don't stop; keep making yourself more effective.


Sunday 27 October 2013

Why Root an Android Device?


Nowadays, rooting an Android device is quite an easy job, but very few of us actually know what rooting is all about. Today we are going to see how rooting actually takes place and what we need in order to root an Android device.

 Process Details:
                           
The process of rooting varies widely by device, but it usually involves exploiting one or more security bugs in the firmware (i.e. in Android) of the device, then copying the su binary to a location in the current process's PATH (e.g. /system/xbin/su) and granting it executable permissions with the chmod command. A supervisor application like Superuser or SuperSU can then regulate and log elevated-permission requests from other applications. Many guides, tutorials and automatic processes exist for popular Android devices, facilitating a fast and easy rooting process.
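To make that final step concrete, here is a minimal sketch of the copy-and-chmod sequence, assuming an exploit has already given you a root shell and that /system can be remounted read-write; the source path of the su binary is purely illustrative:

    # run from a root shell obtained via an exploit (paths are illustrative)
    mount -o remount,rw /system          # make the system partition writable
    cp /sdcard/su /system/xbin/su        # put su somewhere on the PATH
    chmod 06755 /system/xbin/su          # set-uid root, executable by everyone
    mount -o remount,ro /system          # restore the read-only protection

With the set-uid bit in place, any app that executes /system/xbin/su gets a root shell, which is exactly why a supervisor app is needed to approve or deny those requests.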
For example, shortly after the HTC Dream was released, it was quickly discovered that anything typed using the keyboard was being interpreted as a command in a privileged (root) shell. Although Google quickly released a patch to fix this, a signed image of the old firmware leaked, which gave users the ability to downgrade and use the original exploit to gain root access. Once an exploit is discovered, a custom recovery image that skips the digital signature check of a firmware update package can be flashed. In turn, using the custom recovery, a modified firmware update can be installed that typically includes the utilities (for example the Superuser app) needed to run apps as root.
The Google-branded Android phones, the Nexus One, Nexus S, Galaxy Nexus and Nexus 4, as well as their tablet counterparts, the Nexus 7 and Nexus 10, can be boot-loader unlocked by simply connecting the device to a computer while in boot-loader mode and running the Fastboot program with the command "fastboot oem unlock". After accepting a warning, the boot-loader is unlocked, so a new system image can be written directly to flash without the need for an exploit.
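As a sketch, that whole unlock-and-flash flow comes down to a handful of commands (this assumes the adb and fastboot tools from the Android SDK are installed, and "custom-recovery.img" is just a stand-in name for whichever recovery image you use; remember that unlocking wipes the device):

    adb reboot bootloader                          # restart the phone in boot-loader mode
    fastboot oem unlock                            # shows the on-device warning to accept
    fastboot flash recovery custom-recovery.img    # flash a custom recovery (name illustrative)
    fastboot reboot                                # boot back into the system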
Recently, Motorola, LG Electronics and HTC have added security features to their devices at the hardware level in an attempt to prevent users from rooting retail Android devices. For instance, the Motorola Droid X has a security boot-loader that puts the phone in "recovery mode" if a user loads unsigned firmware onto the device, and the Samsung Galaxy S II displays a yellow triangle indicator if the device firmware has been modified.

How to root:
Though you will find lots of links to successful rooting techniques on the web, one wrong rooting step can brick your phone.

BRICK: a state in which your phone will neither boot up nor start, because the OS is no longer properly configured for its hardware components.

So I'll suggest you go for one-click recovery software. Most of these are paid, but if you want to do it for free you can download a free rooting tool instead, i.e.

SRSRoot One Click Root for Android.

It's completely free software, at least for now.

NOTE: Rooting is a process of bypassing Android's built-in security, which may lead to bricking your phone; your warranty is also voided while your phone is rooted. So DO IT AT YOUR OWN RISK.

 

ADVANTAGES OF ROOTING:

Rooting has many advantages over an unrooted phone.

1> Complete control over your applications, i.e. uninstall any application you want. For this you will need a tool like Titanium Backup, which I recommend for root users only; it's an all-in-one tool.

2> Flash new custom ROMs and increase your device's performance.

3> Run an OS like Ubuntu on your Android device. This really works, since I have already done it.

4> Increase your RAM, i.e. use your SD card as extra (swap) memory.

5> Increase internal memory, which allows you to install more apps and games, using apps like Link2SD and App2SD, which need ROOT.

6> Boost the CPU by overclocking it.

7> Save data usage using firewall apps like DroidWall.

8> Secure your internet connection using VPN apps like DroidVPN.

9> Move any application to the SD card to save internal memory.
and many more...
For any queries, join an Android development community like Android Forums.
Thank you :) Keep visiting us and do like us.


Thursday 3 October 2013

Amazing YouTube videos

Today's theme leans towards random amazement.

1) The deadly nano bugs

The Air Force Bugbots nano-drone video gives a peek inside the nano-drone technology the federal government is currently implementing within the United States. The deadly, insect-sized drones of the future, unobtrusive, invasive and lethal, are here as depicted in this old video.

2) Amazing water trick

Water turns into ice in 5 seconds.

3) No more flat tires? Sounds good to me


4) Amazing detailed view of an atom

Scientists at the University of California, Los Angeles have found a way to create stunningly detailed 3D reconstructions of platinum nanoparticles at an atomic scale. These are being used to study tiny structural irregularities called dislocations.

Many more to come; stay tuned.

What Happens When You Put an iPhone 5C or 5S in a Blender :D :D :D -admin )(0Tsho+





A crazy, insane, totally nuts video!

No words to express it :D

This video features a crazy blending scientist who blends stuff :D :D

Wanna see more crazy? Watch this video!

CLICK THIS LINK: https://www.youtube.com/watch?v=GAuhUTzNwiY

I can't watch it /.\

WARNING: ONLY PEOPLE WITH NO HEART CAN WATCH IT!

Wednesday 2 October 2013

Production of gasoline using biotechnology!


Novel Technology to Produce Gasoline by a Metabolically-Engineered Microorganism






Sep. 29, 2013 — Scientists have succeeded in producing 580 mg of gasoline per litre of cultured broth by converting in vivo generated fatty acids.


For many decades, we have been relying on fossil resources to produce liquid fuels such as gasoline, diesel, and many industrial and consumer chemicals for daily use. However, increasing strains on natural resources as well as environmental issues including global warming have triggered a strong interest in developing sustainable ways to obtain fuels and chemicals.
Gasoline, the petroleum-derived product that is most widely used as a fuel for transportation, is a mixture of hydrocarbons, additives, and blending agents. The hydrocarbons, called alkanes, consist only of carbon and hydrogen atoms. Gasoline is a combination of straight-chain and branched-chain alkanes (hydrocarbons) consisting of 4-12 carbon atoms linked by direct carbon-carbon bonds.
Previously, through metabolic engineering of Escherichia coli (E. coli), there have been a few research results on the production of long-chain alkanes, which consist of 13-17 carbon atoms, suitable for replacing diesel. However, there has been no report on the microbial production of short-chain alkanes, a possible substitute for gasoline.
In the paper (entitled "Microbial Production of Short-chain Alkanes") published online in Nature on September 29, a Korean research team led by Distinguished Professor Sang Yup Lee of the Department of Chemical and Biomolecular Engineering at the Korea Advanced Institute of Science and Technology (KAIST) reported, for the first time, the development of a novel strategy for microbial gasoline production through metabolic engineering of E. coli.
The research team engineered the fatty acid metabolism to provide fatty acid derivatives that are shorter than normal intracellular fatty acid metabolites, and introduced a novel synthetic pathway for the biosynthesis of short-chain alkanes. This allowed the development of a platform E. coli strain capable of producing gasoline for the first time. Furthermore, this platform strain, if desired, can be modified to produce other products such as short-chain fatty esters and short-chain fatty alcohols.
In this paper, the Korean researchers described detailed strategies for 1) screening of enzymes associated with the production of fatty acids, 2) engineering of enzymes and fatty acid biosynthetic pathways to concentrate carbon flux towards the short-chain fatty acid production, and 3) converting short-chain fatty acids to their corresponding alkanes (gasoline) by introducing a novel synthetic pathway and optimization of culture conditions. Furthermore, the research team showed the possibility of producing fatty esters and alcohols by introducing responsible enzymes into the same platform strain.
Professor Sang Yup Lee said, "It is only the beginning of the work towards sustainable production of gasoline. The titre is rather low due to the low metabolic flux towards the formation of short-chain fatty acids and their derivatives. We are currently working on increasing the titre, yield and productivity of bio-gasoline. Nonetheless, we are pleased to report, for the first time, the production of gasoline through the metabolic engineering of E. coli, which we hope will serve as a basis for the metabolic engineering of microorganisms to produce fuels and chemicals from renewable resources."



COMPARISON BETWEEN THE BEST OSes AVAILABLE! YOU CHOOSE ;)


Mac OSX Lion vs Windows 8 vs Ubuntu 11.10

First, a little bit of history: Long, long ago, when talks of climate change were in their infancy, 1GB of RAM was something geeks drooled over, and anything even related to blue tooth would have had you running to your doctor, I was a Linux user. Before that, before Windows 3.1 even, I used whatever interface I could get my hands on. But, when there finally became a choice to make, Linux was my choice.
About 7 years ago, due to lack of decent photo editing software, I switched to Windows: Windows 95, to be exact. Then Windows XP, then Windows Vista, and now, finally, Windows 7. It’s been a fun ride. I’ve complained all the way, of course, but jealousy over another operating system due to lack of software in my own simply never came up.
Fast forward to today. Most of my friends use a Mac. And when it comes to developers, graphic designers, writers, comic artists, and other creative types, basically ALL of my friends use a Mac. So I decided to try it. Mac OSX Lion, to be exact. I’m actually surprised it’s taken me this long to make “the switch”. What I’m not surprised about is the number of complaints I have.
I have so many complaints, in fact, that it made me curious, once again, about the state of Linux (in this case, specifically, Ubuntu 11.10) on the desktop. So, what follows is a comparison of the three from the eyes of a geek, nerd, father, web application developer, photographer, hippie. Your opinion will, no doubt, vary, and that’s good. That’s how it should be.
There are several aspects to a specific platform. One of those is the Operating System itself: how core features like windowing, launching applications, and customization work. Secondly, there is the hardware itself, the features it provides, and the way those features are implemented. And finally, there is the software that allows the use and expansion of a platform.
Let me start with a quick synopsis of each platform. For Mac OSX Lion, the hardware is included. In this case, it’s a 2012 Model, Mac Book Pro w/ 15″ screen, 4 GB RAM, Intel 2.2 Ghz i5 processor. In the case of Windows 7 and Ubuntu 11.10, it’s an AMD X6 2.8Ghz 6 core processor, and 12 GB of RAM displayed in a 23″ LCD monitor. Totally different specs, but, due to the nature of the Mac platform, I was unable to test the Mac setup on anything other than Mac hardware. It should also be known that, as far as price goes, the Ubuntu/Windows machine was $950 (less for Ubuntu since the included copy of Windows 7 wouldn’t be required). The Mac Book Pro was $1800. So, almost double the price.
The first thing I noticed about Mac OSX is that, contrary to the claims of most of my friends, things do not “just work”. Despite my experience with at least 10 different operating systems and at least 5 different windowing systems, Mac OSX left me confused. And despite being a fully recognized operating system, there are software gaps. The depth and breadth of what you see on the Windows platform is simply not there. And in many cases when it is there, it’s because someone has made a Linux application work on Mac OSX, which means that the application doesn’t fit the feel of Mac OSX and leaves the platform feeling disconnected and even more confusing. Finally, even some of the Apple-provided applications don’t work in ways you would expect, even after you’ve gotten a feel for the OS.
Ubuntu still has the same problems it did 7 years ago. It looks nicer than it did, has more features than it used to, and is overall a fantastic system. But, just like before, it lacks the polish and cohesive nature of a more mature, more developed operating system. On top of that, while there are certainly more applications available, there still isn’t the depth and breadth that the Windows community enjoys. Installation is still problematic, and lesser used features simply don’t work.
Windows is windows. Some advancements have been made in the UI, many of which are quite nice and very welcome. However, the underlying architecture remains mostly the same. This means that installation and basic setup is painless. However, as soon as any additional applications are involved, the end result is an unknown depending on how that software uses the platform. Most applications fit the UI feel of Windows, but many do not. The UI does not feel as polished as Mac OSX and, in some cases, even Ubuntu wins out in this department. Extending the operating system beyond the original designs is almost impossible and most software that tries to do so fails miserably.
So now, on to specifics.

Mac OSX Lion
Mac OSX was breathtaking upon start up. The UI, though different than what I’m used to, feels polished and welcoming. Fonts are consistent. Button placement is obvious. Setup is incredibly simple. I’d feel comfortable handing a Mac to my Mom and expecting that, with a few pieces of information jotted down on a scrap of paper, she’d be able to set it up. Mac really wins in this department. Windows is complicated. There are basic things that need to be done on a new computer that Windows does not make obvious. Things like wireless setup, locating software to perform tasks, and creating a secure user account. Even Ubuntu (aside from actual
installation which I did not have to do with Mac OS X) does a better job of this than Windows does. In fact, in many aspects, Ubuntu gets very close to the Mac OSX feel during initial use.
Mac OSX, however, started to get confusing the first time I tried to install software. Installing via the App Store is easy and Apple approved. You click, it installs, and then it pops up a window showing you where your application is and making it easy to get to. So as long as what you want is in the App Store, the Mac is a breeze. Unfortunately, as I said before, Mac suffers from a lack of depth and breadth in the application arena. And, if you are limited to only the App Store, it suffers even more so. So, inevitably, installing software outside of the App Store is bound to happen. And this is also where things get complicated. Some packages come as “.dmg” files, which the Mac opens and treats as a separate hard drive. Many of these “.dmg”s offer a window that shows the icon for the new app and an icon for your Applications directory. It isn’t always obvious that the install happens when you drag one icon onto the other. However, I eventually figured that out and felt like I had a handle on things. Then I installed another piece of software. This one worked differently. It used a “Finder” window to display the contents of the “.dmg”. When one application creates a new Finder window, the Mac doesn’t make this obvious. So you’re left wondering if it even worked, clicking on things and trying to figure out what to do next, until you finally click on the “Finder” icon to go look for it and realize that it’s made a new window that needs your attention. The same thing happens if the application you’re downloading uses a “.zip” file for distribution. Once you get past this hurdle and remember to check Finder when things don’t work the way you expect, application installation gets pretty simple.

A friend gave me a developer release of Mac OSX Mountain Lion. So I thought I’d try it. I had no idea that I needed a developer license in order to do this “legally”. So I tried it. It was broken. And that’s okay. It’s beta. That’s what betas are for. So I decided I wanted to go back. And that’s when I met the Mac User Community. These people are, largely, a bunch of jerks. Their response to other people asking questions similar to mine — “How do I get back to Lion?” — was simply “you shouldn’t have installed it in the first place!” or “you made a backup, didn’t you?” or “I can’t tell you unless you’re an Apple developer, and if you were an Apple developer you’d already know this.” This elitist and unwelcoming response doesn’t give you the warm, fuzzy feeling you get from most of the Windows and Ubuntu communities, where people are looking to help one another.
When attempting to have multiple users logged in at once, I encountered another strange error. Portions of windows from one logged in account intermittently appeared in blotches on the screen of the other logged in account. This would be terrible if the account in the background had received a sensitive email or had other sensitive materials on the screen. Eventually, after this event, the system got so confused that I had to hold down the power button and just reboot.
The Mac hardware is stellar. The trackpad is big and easy to use. And the integration of that hardware with the software is outstanding. Scrolling, pointing, clicking all make perfect sense. Two finger scrolling on Windows is cumbersome, and only slightly less so on Ubuntu. But on the Mac, it’s just easy. The keyboard feels nice, the card reader does things that make sense, the power adapter is the easiest I’ve ever seen. It’s obvious why Apple chooses to pair Hardware and Software together as a cohesive platform, because the attention to detail in this area is simply fantastic.
I have only two complaints about hardware when it comes to Mac. The first is that it simply costs too much. The price tag is off-putting to most people, who see systems with better specs sitting next to it for half the price. The second complaint is that Mac doesn’t offer anything for the truly POWER user. So if I need an application that will run fast, with lots of RAM and lots of processor power, Mac is simply not an option for that software because they don’t have the hardware to go with it. If I could get my hands on a 48-core processor with 256 GB of RAM, as long as it was made in a supported way, I could run Ubuntu or Windows on it. And since the hardware is more expensive on the Mac side of things, when it comes to sheer processing power, you’re always going to get more with Windows and even more with Ubuntu due to the seriously decreased overhead of the platform.
While I understand Mac’s desire to pair hardware and software, they are limiting themselves and their user base. If expanding their base of possible users is what they want, they need to come up with a hardware approval program that dictates the bare minimums. And they need to allow the Operating System to be installed on machines that meet or beat that criteria. I don’t even care what the criteria is, as long as their own hardware can pass the tests. But, if I can piece together hardware that meets or exceeds all the technical requirements (even if it includes things like an ambient light sensor, or a backlit keyboard) then the OS needs to be buyable and installable. This will allow the Mac platform to be used in cases where it would be a very good fit today if only Apple would allow it.

Ubuntu 11.10


The Ubuntu installation is terrible. I installed it on two different machines, actually. The first failed at the end of the install, telling me it couldn’t install the MBR. Without any kind of explanation or system knowledge, there’s no way a “regular guy” would know what to do at that point. I’m not a regular guy. So I answered the question and, even then, got it not-quite-right and had to fix something manually after reboot. Most people aren’t going to know how to do this, and I don’t believe they should be expected to. On my second install, I tried to install Ubuntu alongside Windows. The installer, again, failed by selecting a portable USB drive as the installation point and then trying to put the MBR on that disk, which would never have been seen at boot up. Thankfully, I know what I’m doing and fixed the issue. My mom could not have installed Ubuntu.

Once installed, however, Ubuntu was easy. The UI is beautiful and friendly. It’s easy to find things. Easier, even, than Mac OSX. Customization is obvious and simple. I was quite pleased at how quickly I could get a basic system up and I was very comfortable handing this foreign interface to my girlfriend who had no trouble visiting websites and performing other basic tasks.
Once the surface was scratched, though, Ubuntu’s not-quite-stable nature showed its head. I was trying to connect to a shared drive on my network, a drive that Mac OSX and Windows 7 had no trouble connecting to (not to mention my XBOX 360, Android phone, and Google TV). But Ubuntu simply couldn’t connect to it. It gave error messages that didn’t make sense. I understand more than most, and I was able to dig deeper and make it happen, but I would say that most people without extensive platform knowledge wouldn’t have been able to.
At another point, when I was performing an upgrade that the system had popped up to recommend, the progress bar stalled. Despite my clicks and attempts, it wasn’t moving. So I closed the window. This left the machine in a state where the keyboard did not work upon reboot and the system was absolutely unusable. Again, my prior knowledge came in handy as I booted into recovery mode, manually fixed the problem, and then rebooted. This is not something my Mom could have handled.
Ubuntu also offered me a choice in regard to display drivers. While the geek in me appreciates the choice, most users just want whatever works best (which was not the default). If this is an option Ubuntu wants to continue offering, then they need to make it more hidden and less obvious. In this way, a serious tinkerer will find it and make their choice, but the average user doesn’t need nor want to be bothered with this sort of thing.

Windows 8

Windows is just windows. The platform is powerless. Even the Mac (with severely less capability in the hardware department) felt faster at most tasks. And Ubuntu absolutely blew it out of the water. The system is bloated. The UI is complicated. And factory-installed additions to the OS make it even less cohesive and more complicated. Once an installation has been cleaned up, it’s easy enough to use, but that is, in part, because people are just used to the way it works. Handing Windows to my Mom would result in some success, but also a lot of questions. She would get a lot further on her own with Mac OSX. And, assuming she didn’t bump into any of the absolutely broken things in Ubuntu, she’d even get a lot further there. If Windows wants to continue to compete in this arena, they are going to need a backend overhaul and some serious redesign of the UI.
Applications in Windows are second to none. If there is something I want my computer to do, no matter how obscure or niche, I stand the greatest chance of being able to do it on Windows. If I have some strange piece of hardware I want to use, it’ll probably work in Windows. That can’t be said for Mac OSX or Ubuntu. These platforms both need more hardware support and more Application Development. There’s really no such thing as TOO MUCH when it comes to these things, but there certainly is such a thing as NOT ENOUGH.
Most advanced tasks are made even more difficult by Windows’ UI. With Ubuntu and Mac OSX, if the UI doesn’t support it, there is still a way to dig into the guts of the OS, tap into a huge user community, and make it happen anyway. With Windows, if the UI doesn’t support it, in most cases, it can’t be done. And the things the UI supports are not always obvious. Bluetooth is a joke. Dealing with disk storage and intelligent file placement is a joke. The registry continues to be a nightmare, and it’s only been augmented in recent years by a series of hidden directories that make it almost impossible to figure out what’s making a certain thing happen. Mac OSX handles this beautifully. Applications, configuration, and supporting files are all well contained, easily found, and fully tinkerable, if you dare. Ubuntu is less good, but still completely manageable, especially considering the large, very helpful user community. In Windows, it feels like most things hang on a wing and a prayer.

Conclusion
If you can afford to pay double the price for the hardware, and are willing to pay double the price for some of the accessories you might need, and you don’t have a need for huge computing resources or very specific application requirements, Mac OSX is the way to go. It’s easy to use for those who just want to get something done. And the system underpinnings are well designed and advanced enough that someone with knowledge and skill in the platform can make almost anything happen.
If you can’t (or don’t want to) afford the price tag, or have high computing resource needs, Ubuntu might be right for you. However, you’ll also need to be very knowledgeable about the system in order to get some fairly common tasks done, because things tend to break. You’ll also need to be capable of doing research ahead of time on hardware and software to make sure Ubuntu suits you.
If you need high computing power, have specific hardware needs that are unmet by the above options, or are unwilling to pay the Mac Tax and don’t have the technical skills to manage Ubuntu, then Windows is your only choice. It’s not a terrible choice, and I’m sure that, with an occasional call to a more technical friend, most people will be able to manage with Windows. Because of these things, I can’t see the population, en masse, choosing anything other than Windows.
If Apple is interested in making Mac OSX something for EVERYONE, then at a bare minimum they will need to lower the price. An easy way to do this and appease power users and tinkerers is to implement an “approved hardware” system. This will let commodity hardware manufacturers design inexpensive systems that will run Mac OSX. Apple can maintain their current hardware line; those who have come to love Mac and pay the Mac Tax believe they are getting what they pay for, and they will continue to do so. Users who need more power than a Mac Mini can buy an “off brand” system with Mac OSX installed, knowing that they aren’t getting the very best in hardware, but that the hardware they do have is approved to work fully with Mac OSX. And the power users who need 8 cores and 24 GB of RAM can find or build a system that suits their needs. With this in place, the depth and breadth of application support Windows users enjoy will begin to be shared on the Mac platform.
If Ubuntu is interested in making itself something for everyone, it will need, more than anything else, to work out the kinks. The UI is beautiful. But if the processes the UI is built upon don’t work reliably in almost every case, then the UI is worthless, because an expert is going to be required to sort out the mess anyway. Once these bits are cleaned up and made “mom proof”, Ubuntu needs to do more to pair itself with hardware manufacturers in order to enable more people to buy preinstalled Ubuntu systems, so that the installation procedure becomes unimportant to the average user.
If Windows is interested in keeping the something for everyone that they’ve already made, they need to overhaul the underlying system. Even if that means throwing everything away and starting over (as Apple did with Mac OSX) then that’s the route they need to take. And while recent UI improvements are outstanding and very welcome, there’s a long way to go to making a system that’s as easy to use as Mac OSX or Ubuntu.
For me, in my home, I’m sold on Mac. I love the platform and, as situations arise where new machines are needed, I’ll be buying Mac whenever my wallet can handle the blow. However, I’m not about to throw away the well-made, fully-functional systems I already have. And since Apple refuses to let me use those systems with their OS even though they are more than capable, Ubuntu is the best choice for me because I’m capable of fixing something if it breaks. If I weren’t then Windows would be my only choice.

Tuesday 1 October 2013

Getting to Know the Universe

 


7 Surprising facts about the Universe.........

1. The Universe Is Old (Really Old)


The Big Bang: Solid Theory, But Mysteries Remain
The universe began with the Big Bang, and is estimated to be approximately 13.7 billion years old (plus or minus 130 million years). Astronomers calculated this figure by measuring the composition of matter and energy density in the universe, which enabled them to determine how fast the universe expanded in the past. As a result, researchers could turn back the hands of time and pinpoint when the Big Bang occurred. The time in between that explosion and now makes up the age of the universe.
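As a rough back-of-the-envelope illustration of why the expansion rate pins down the age (the real calculation also folds in the measured matter and energy densities), simply inverting an assumed Hubble constant of about 70 km/s/Mpc gives a "Hubble time" in the right ballpark:

    t_H = \frac{1}{H_0} \approx \frac{3.09 \times 10^{19}\ \mathrm{km/Mpc}}{70\ \mathrm{km/s}} \approx 4.4 \times 10^{17}\ \mathrm{s} \approx 14\ \mathrm{billion\ years}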



2. The Universe Is Getting Bigger

Dark Energy Mystery Illuminated By Cosmic Lens

In the 1920s, astronomer Edwin Hubble made the revolutionary discovery that the universe is not static, but rather is expanding. But, it was long thought that the gravity of matter in the universe would slow this expansion or even cause it to contract. In 1998, the Hubble Space Telescope studied very distant supernovas and found that, a long time ago, the universe was expanding more slowly than it is today. This puzzling discovery suggested that an inexplicable force, called dark energy, is driving the accelerating expansion of the universe.
While dark energy is thought to be the strange force that is pulling the cosmos apart at ever-increasing speeds, it remains one of the greatest mysteries in science because its detection remains elusive to scientists.



 3. The Universe's Growth Spurt Is Accelerating

Einstein's 'Biggest Blunder' Turns Out to Be Right
Mysterious dark energy is not only thought to be driving the expansion of the universe, it appears to be pulling the cosmos apart at ever-increasing speeds. In 1998, two teams of astronomers announced that not only is the universe expanding, but it is accelerating as well. According to the researchers, the farther a galaxy is from Earth, the faster it is moving away. The universe's acceleration is consistent with Albert Einstein's theory of general relativity, and lately, scientists have revived Einstein's cosmological constant to explain the strange dark energy that seems to be counteracting gravity and causing the universe to expand at an accelerating pace.
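In equation form, that distance-velocity relation is Hubble's law (the value of the Hubble constant below is an assumed round number typical of modern measurements, not a figure from this article):

    v = H_0 d, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}

so a galaxy twice as far away recedes twice as fast; the 1998 supernova results showed that this expansion rate has been speeding up over cosmic time.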
Three scientists won the 2011 Nobel Prize in Physics for their 1998 discovery that the expansion of the universe was accelerating.




4. The Universe Could Be Flat

Deformed Galaxies Confirm Universe's Acceleration
The shape of the universe is influenced by the struggle between the pull of gravity (based on the density of the matter in the universe) and the rate of expansion. If the density of the universe exceeds a certain critical value, then the universe is "closed," like the surface of a sphere. This implies that the universe is not infinite but has no end. In this case, the universe will eventually stop expanding and start collapsing in on itself, in an event known as the "Big Crunch." If the density of the universe is less than the critical density value, then the shape of the universe is "open," like the surface of a saddle. In this case, the universe has no bounds and will continue to expand forever.
Yet, if the density of the universe is exactly equal to the critical density, then the geometry of the universe is "flat," like a sheet of paper. Here, the universe has no bounds and will expand forever, but the rate of expansion will gradually approach zero after an infinite amount of time. Recent measurements suggest that the universe is flat with roughly a 2 percent margin of error.
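For the curious, the critical density mentioned above has a simple closed form, a standard result of the Friedmann equations; plugging in an assumed Hubble constant of about 70 km/s/Mpc (not a figure from this article) gives a strikingly small number, equivalent to only a few hydrogen atoms per cubic metre:

    \rho_c = \frac{3 H_0^2}{8 \pi G} \approx 9 \times 10^{-27}\ \mathrm{kg/m^3}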


5. The Universe Is Filled With Invisible Stuff

Hubble Reveals Ghostly Ring of Dark Matter


The universe is overwhelmingly made up of things that cannot be seen. In fact, the stars, planets and galaxies that can be detected make up only 4 percent of the universe, according to astronomers. The other 96 percent is made up of substances that cannot be seen or easily comprehended. These elusive substances, called dark energy and dark matter, have not been detected, but astronomers base their existence on the gravitational influence that both exert on normal matter, the parts of the universe that can be seen.



6. The Universe Has Echoes of Its Birth

This all-sky image of the cosmic microwave background, created by the European Space Agency's Planck satellite, shows echoes of the Big Bang left over from the dawn of the universe.


 The cosmic microwave background is made up of light echoes left over from the Big Bang that created the universe 13.7 billion years ago. This relic of the Big Bang explosion hangs around the universe as a pocked veil of radiation.

The European Space Agency's Planck mission mapped the entire sky in microwave light to reveal new clues about how the universe began. Planck's observations are the most precise views of the cosmic microwave background ever obtained. Scientists are hoping to use data from the mission to settle some of the most debated questions in cosmology, such as what happened immediately after the universe was formed. 



7. There May Be More Universes




The idea that we live in a multiverse, in which our universe is one of many, comes from a theory called eternal inflation, which suggests that shortly after the Big Bang, space-time expanded at different rates in different places. According to the theory, this gave rise to bubble universes that could function with their own separate laws of physics. The concept is controversial and had been purely hypothetical until recent studies searched for physical markers of the multiverse theory in the cosmic microwave background, which is a relic of the Big Bang that pervades our universe.
Researchers searched the best available observations of the cosmic microwave background for signs of bubble universe collisions, but didn't find anything conclusive. If two universes had collided, the researchers say, it would have left a circular pattern behind in the cosmic microwave background.


Monday 30 September 2013

Memory Improving games


    Exercise Your Brain Daily


FACTS

Like every other muscle in your body, your brain also needs a workout plan, a healthy diet and a routine.

It has been found that there are roughly 100 billion neurons in an adult brain. That's quite a lot, and in order to make use of them we need to keep forming connections between neurons at our synapses; the brain's ability to form and reorganize these connections is called neuroplasticity.


So start exercising your brain!

The best way is to play memory games.

 

 

 


Sunday 29 September 2013

Microsoft Surface Pro 2

Microsoft Surface Pro 2: A specifications review 

Microsoft Surface Pro 2
After Google and Apple, Microsoft has made its “big” announcement for the month. The company has launched its new Surface tablets running the latest Windows 8.1 and Windows RT 8.1 OS updates. The Surface Pro 2 is the revamped version of the Surface Pro and runs Windows 8.1. Let’s take a look at the specifications to find out what changes Microsoft has introduced with the new iteration.

OS – Windows 8.1
The Surface Pro 2 runs the newest Windows 8.1 update, codenamed “Blue”. While the 8.1 update has created some ripples thanks to its modern-looking UI, it does a lot to fix the problems Windows 8 launched with. The biggest and most welcome change is that Microsoft is bringing back the good ol’ “Start” button. The 8.1 update comes with a host of new features, apps and highlights, including a better on-screen keyboard, new alignment features that let users open two or more windows side by side, Internet Explorer 11, and UI scaling of up to 200 percent. Skype comes pre-installed on the device, and Surface Pro 2 owners will also receive a year of free international calls and access to Skype public Wi-Fi hotspots.

Display: 10.6-inch ClearType capacitive touchscreen with 1080p
Microsoft hasn’t brought anything new to the display. The Surface Pro 2 gets the same 10.6-inch ClearType touchscreen as the previous model, and the resolution hasn’t been upgraded either: you’ll find the same 1920 x 1080 pixels. But we’re happy with a full-HD display, and the company has employed a screen panel with 46 percent better colour accuracy. All in all, we don’t think the screen will disappoint.

The new Surface Pro 2


Form-factor – Not much change here
Not much has changed in terms of the design of the Surface Pro 2 compared to its predecessor, except for the new kickstand. The built-in kickstand is definitely the one stand-out feature (no pun intended). It supports two positions, allowing the tablet either to stand upright or to tilt back at 45 degrees. The build quality is maintained with the “VaporMg” magnesium casing. The tablet hasn’t got any lighter, as it still weighs the same 907 grams and measures 27.46 x 17.30 x 1.35 cm.

Though there aren’t many changes in the design, the tablet gets a bevy of new accessories: the Power Cover, which helps boost battery life, and the Music Kit, a simplified keyboard with bigger buttons for use with sound-recording software. Then there’s the really cool Docking Station, which is basically a port replicator (and extender) for your Surface Pro 2. The Docking Station features four USB ports, one Mini DisplayPort, 3.5mm audio in and out sockets, and an Ethernet socket.

Processor
The Microsoft Surface Pro 2 is powered by the fourth-generation Intel Core i5-4200U “Haswell” processor, clocked at 1.6GHz. However, it is capable of running at up to 2.6GHz, thanks to Intel’s Turbo Boost tech. It also gets Intel HD Graphics 4400, which helps deliver 50 percent better graphics performance than its predecessor. All this is coupled with 4GB of RAM, while the higher 256GB and 512GB configurations get 8GB of RAM. The company claims the Surface Pro 2 is faster than 95 percent of today’s laptops; however, we’ll only know that once benchmark results are out.

Internal storage – Four capacities, with microSD card slot and cloud options
The Surface Pro 2 comes with 64GB, 128GB, 256GB or 512GB of internal SSD storage. Users can further expand this using the microSD card slot, and with the 200GB of free SkyDrive space that the company is offering. Overall, we’re quite satisfied with the storage options the Surface Pro 2 comes with; they allow us to dream of this device replacing our laptops.


Camera – 1.9MP rear and front cameras
Like last time, the cameras on the Surface Pro 2 are a disappointment. Microsoft has employed the same 1.9MP cameras on the front and back, capable of shooting 720p video. Not too bad for video calls, but do not expect the tablet to be great at capturing stills.

The kickstand


Sensors, GPS & Connectivity

The Surface Pro 2 comes with the usual suspects: an accelerometer, a gyroscope and a compass, and nothing more. On GPS, Microsoft has been silent as to whether the tablet supports A-GPS or GLONASS. The tablet supports dual-band Wi-Fi 802.11 a/b/g/n, but again it isn’t clear whether it will support the draft 802.11ac standard. Wi-Fi Direct can be used to share files with other devices on the same network. The big hole in Microsoft’s announcement last night was the absence of any mention of 3G or LTE connectivity for the Surface Pro 2.

Battery

Battery life was one of the sore points of the original Surface Pro, which lasted roughly 5 hours. However, the company has made amends here, as the Pro 2 is said to offer 75 percent longer battery life; that works out to roughly 8.75 hours (5 hours x 1.75). The Pro 2 gets the same 4,200mAh battery, but supposedly better battery management has been introduced. The new Power Cover, a keyboard cover with a built-in battery pack, extends battery life even further.

Bottom Line
Visually, the Pro 2 is quite similar to its predecessor. Apart from the improved display, better battery life and improved performance, there aren’t many surprises in Microsoft’s new tablet. Instead of new features, the company has focused on a bevy of accessories that serve as enhancements: the Power Cover increases battery life, for instance, and the Docking Station acts as a desktop replacement. Clearly, Microsoft is hoping that with this iteration the Surface Pro 2 takes a step towards becoming a laptop replacement, which is not a plan many of its OEM partners will be happy with.

The Surface Pro 2 will be available for pre-order from September 24 on the official Microsoft online store, starting at $899 (approximately Rs 56,270).