Written on Sunday, June 08, 2008 by Gemini
A new version of the open-source Mozilla Firefox Web browser is scheduled for release on Tuesday, June 17, with improvements in security, speed and design. Many of the enhancements in Firefox 3 involve bookmarks. The new version lets Web surfers add keywords, or tags, to sort bookmarks by topic. A new “Places” feature lets users quickly access sites they recently bookmarked or tagged, and pages they visit frequently.
There’s also a new star button for easily adding sites to your bookmark list – similar to what’s already available on Microsoft’s Internet Explorer 7 browser. Other new features include the ability to resume downloads midway if the connection is interrupted and an updated password manager.
In a nod to the growing use of Web-based email, the browser can be set to launch Yahoo’s service when clicking a “mailto” link in a Web page. Previously such links could only open a standalone, desktop email program. Firefox will also block rather than simply warn about sites known to engage in “phishing” scams that trick users into revealing passwords and other sensitive information.
Microsoft is currently testing Internet Explorer 8, while Opera Software ASA recently released Opera 9.5.
Posted in Evolution, Interesting, Internet, Microsoft, New software, Software Vendor |
Written on Sunday, June 01, 2008 by Gemini
SEO Marketing is a set of techniques used by professionals who want to get your web page ranked high on the search engines. When you implement the proper SEO Marketing techniques on your web page, you will quickly see the improvements to your customer base and your business. Follow the step-by-step guide that many SEO Marketing teams supply, and soon you will find that you have the high-ranking web page that brings you the business that you want.
Professional SEO Marketing
Choosing to upgrade your web page to a highly optimized advertising machine is something that many business owners strive for. In order to do this, you should always get the best advice possible. By working with the professional SEO Marketing team you find here, you will be able to gain the level of web presence that you need for optimal results.
SEO Marketing works on a number of different levels. Educating yourself about SEO Marketing will help you when it comes time to ask questions. Asking the right ones will allow you to be more involved in the process and in getting the right SEO plan for you.
SEO Marketing
Get the SEO marketing that you need when you sign up with a marketing firm that brings you the right information and the end product that you want, right on time, every time. When you are ready to build your client base and raise your level of visibility on the search engines, you can do so by getting the right SEO Marketing firm to back you. Check out the information that you see before you right here.
Posted in Search engines |
Written on Saturday, May 31, 2008 by Gemini
Dating is easy when you sign up for a membership in a dating site. With technology as fast and easy as it is, you can find the love of your life, or at least a solid friend, when you allow yourself to meet new people. Dating has never been easier, and when you check out the services that are provided to you on this site, you will be able to increase your chances of meeting that special someone now.
Dating sites have changed
Dating sites have changed. Now, more than ever, you can opt to sign up without having to commit to a relationship. Many people are simply looking to meet someone they can connect with, be it on a romantic or a platonic level. Get the information that you want when you are seeking the right person for you. On this dating site, you will gain the information that you need. Safety, efficiency, and privacy are all important when seeking the right kind of mate.
Dating online
Increase your chances of meeting that special someone right now. Dating has never been easier, and when you check out the services that are provided to you on this site, you will find that you are more than satisfied with the outcome. Get the information that will protect you and give you opportunities to date the right type of people for you. Get started today.
Posted in General research, Life Saving Research, Love |
Written on Thursday, May 29, 2008 by Gemini
In recent times, the world's leading technology universities have been opting for a different business model, for the sole need of survival. Earlier, they were involved only in research & development and were least focused on the business front. Things started changing when investors in these institutions reduced their investments, and these R&D centers started concentrating on business as well. That's why you'll see many R&D institutes going out into the market selling their research. Many leading technology institutes are in this league. However, in my opinion, Yissum - the technology transfer services company of the Hebrew University of Jerusalem - is in the lead position through its highly appreciated university technology transfer programs. It is responsible for marketing the inventions and know-how generated by the University's renowned researchers and students. It has expertise in diverse domains, from nanotechnology to medicine and pharmaceuticals, agriculture and nutrition, water and environmental technologies, computer science and homeland security. Speaking in terms of numbers, Yissum has granted 400+ technology licenses and is responsible for commercializing products with over $1 billion in worldwide sales every year. Over the past 40 years, Yissum has churned out many popular products like Exelon, Doxil, Superior ceramic ink, UV pearls, Sumo louse repellent, and the Ram onion. Many successful companies were born at Yissum, including Algen Biopharmaceuticals, Avian Tech, Ester Neurosciences, HumanEyes Technologies, and more. Yissum has won many patents (and many more are pending) on various technologies, namely Single Image Dehazing, Prevention of Age-Related Retinal Deterioration, New Biosensor for Nerve Gases, Drought-Tolerant Trees, and a Process for Producing Organic Ultra Thin Films. I am mesmerized by the wide array of their research in the technology field - Improved Cache Performance with Reduced Energy Consumption; Compiler Aided Ticket Scheduling; Markov Model Application for DNA-Arrays and Gene Identification, just to name a few. If you want to involve yourself in the world's leading technology institutions and contribute to award-winning technologies and products, I recommend Yissum - a true technology transfer company.
Posted in Communication research, Dangerous research, Energy research, General research, Health research, Interesting, Life Saving Research, Nanotechnology |
Written on Sunday, May 25, 2008 by Gemini
In a world first, researcher builds a robot that moves like a human; may help create rehabilitation techniques for those who can’t walk.
In the world of science fiction movies, robots move around with ease as they walk, run and jump, just like humans. However, in reality, getting an automaton to emulate man’s gait is a very complex activity.
Now, researcher Daan Hobbelen of Netherlands-based Delft University of Technology (TU Delft) has developed a new, highly advanced walking robot: Flame. This type of research, for which Hobbelen will receive his PhD, is important as it provides insight into how people walk. “This research could help people with walking difficulties through improved diagnoses, training and rehabilitation equipment,” Hobbelen notes in his study.
Energy Efficient & Stable
If you try to teach a robot to walk, you will discover just how complex an activity it is, the researcher says. Walking robots have been around since the 1970s, and two strategies have been applied to building them. The first derives from the world of industrial robots, in which everything is fixed in routines, as is the case with factory robots. Although this approach can produce excellent results, there are major restrictions with regard to cost, energy consumption and flexibility.
The other method for constructing walking robots – which Hobbelen used – examines the way humans walk. This is really very similar to falling forward in a controlled fashion. Adopting this method replaces the cautious, rigid way in which robots walk with the more fluid, energy-efficient movement used by humans.
What makes Hobbelen’s research unique is that this is the first time a robot has been demonstrated to be both energy-efficient and highly stable. Hobbelen’s breakthrough came in inventing a suitable method for measuring the stability of the way people walk. This is a remarkable feat, as ‘falling forward’ is traditionally viewed as an unstable movement.
Next, he built a robot with which he could demonstrate the improved performance: Flame.
Flame contains seven motors, an organ of balance and various algorithms which ensure its high level of stability. For instance, the robot can apply the information provided by its organ of balance to place its feet slightly further apart, in order to prevent a potential fall. “Flame is the most advanced walking robot in the world, at least in the category of robots which apply the human method of walking as a starting principle,” Hobbelen says.
Written on Tuesday, May 20, 2008 by Gemini
Washington: A US researcher has found that video games – with the power to energise players and induce a positive mood – may help increase a person’s creativity.
S Shyam Sundar, a professor of Film, Video and Media Studies at Penn State University undertook the study with a graduate student named Elizabeth Hutton with a view to understanding the value of video games as a vehicle for sparking positive social traits, such as creativity.
During the study, 98 college students were asked to play a popular video game known as ‘Dance Dance Revolution’, at various levels of complexity. The students took a standard creativity test after playing, and the researchers also asked them whether they were feeling either positive or negative after the game.
Upon a statistical analysis of the two emotional variables and the students’ creativity scores, the researchers found two totally different groups with high scores. The researchers noted that players with a high degree of arousal and positive mood were most likely to have new ideas for problem solving.
They also observed that creativity scores were highest for players with low arousal and a negative mood.
According to the researchers, their findings appear to show that either high or low arousal is key to creativity, and that medium amounts of arousal are not conducive to “thinking out of the box”. “When you are highly aroused, the energy itself acts as a catalyst, and the happy mood acts as an encouragement. It is like being in a zone where you cannot be thrown off your game,” said Sundar, who is also a founder of the Penn State Media Effects Research Laboratory.
He says that a negative mood, especially when there is low arousal, brings a different kind of energy that makes a person more analytical, which is also crucial to creativity. Sundar also said that video games can be used in classrooms to energise students and improve their creativity, and in companies to improve corporate decision-making.
Written on Sunday, May 11, 2008 by Gemini
Researchers at the US Department of Energy’s Argonne National Laboratory have developed a chip that can save lives by diagnosing certain cancers even before patients become symptomatic. A tumour – even in its earliest asymptomatic phases – can affect proteins that find their way into a patient’s circulatory system. These proteins trigger the immune system to kick into gear, producing antibodies that regulate which proteins belong, and which do not.
-- Picture: Argonne biologist Daniel Schabacker prepares to load a biochip onto a scanner. The biochip (below) contains grids of small wells or ‘dots’, each of which contains a protein, antibody or nucleic acid, which helps detect cancer. --
The new technology, known as a biochip, consists of a 1x1 cm array that comprises anywhere between several dozen and several hundred ‘dots’, or small drops. Each of these drops contains a unique chemical that will attach itself to particular proteins that could be cancer tell-tales. “Antibodies are the guardians of what goes on in the body,” said Tim Barder, president of US-based Eprogen, Inc, which has licensed Argonne’s biochip technology. “If a cancer cell produces aberrant proteins, then it’s likely that the patient will have an antibody profile that differs from that of a healthy person,” he added.
In their hunt for cancer indicators, Eprogen uses a process that sorts thousands of different proteins from cancer cells by both their electrical charge and their hydrophobicity or “stickiness.” The process creates 960 separate protein fractions, which are then arranged on a single biochip as ten 96-well grids. Scientists then probe the microarrays with known serum or plasma “auto-antibodies” produced by the immune systems of cancer patients.
By using cancer patients’ own auto-antibodies as a diagnostic tool, doctors could potentially tailor treatments to each patient’s personal auto-antibody profile. What makes this technique unique is that scientists can use the actual expression of the patient’s disease as a means of obtaining new and better diagnostic information that doctors could use to understand and fight cancer better. Biochips have already shown promise in diagnostic medicine and are useful in rapidly and accurately detecting other diseases, said Argonne biologist Daniel Schabacker, who developed the technology.
Posted in Health research, Life Saving Research, Nanotechnology |
Written on Monday, May 05, 2008 by Gemini
Emotion-detecting robot cars will face off against eavesdropping flying saucers in the English countryside later this year, as scientists and school children compete with their designs for the next generation of military equipment. It’s the British Ministry of Defence’s first ever “Grand Challenge”, aimed at encouraging scientists, inventors and academics to turn ideas into machines for army use in urban environments.
It gave six finalists funding to build machine prototypes, such as mini-helicopters and disc-shaped flying robots fitted with heat and motion sensors that can be controlled remotely from a bunker. And the finalists, who each received 300,000 pounds (Rs 2.4 crore), came to London last week to display their models.
“This project has really allowed us to broaden our vision and look at what other work is being done out there in our field,” said Norman Gregory, business manager for the Silicon Valley Group PLC, a small research company in southeast Britain. His company teamed up with the Bruton School for Girls in Somerset to build an unmanned buggy that can analyse gunmen’s movements to determine whether they are angry or nervous. “We are a small company and would not have been able to put together a consortium to develop such a sophisticated system without this competition. The government made it clear it wanted consortiums to get schools involved, and since the Bruton school already ran its own robot design competitions, we asked them if they wanted to have a look at our research,” Gregory said.
Another group, Swarm Systems Ltd, has built a set of tiny helicopters that fly in formation into a village, recording images and audio tracks to beam back to headquarters. Finalists will take part in a mock battle in August in Copehill Down, a village that was modelled on an East German one when it was built for military training during the Cold War. Copehill Down is near Stonehenge, about 150 km from London. The contestants will have their machines search for pretend gunmen and mock bombs, earning points for each find and losing points for hitting civilians or transmitting data too slowly. The winner gets a trophy made from recycled metal recovered from a WWII fighter. The best designs will also get financial backing from Britain’s defence ministry.
Posted in Gadgets, General research, Products, Robotics |
Written on Tuesday, April 29, 2008 by Gemini
The economy is headed into recession, if it isn't there already, and IT budgets are feeling the pinch. But that doesn't mean companies are putting their business intelligence (BI) plans on hold, especially if those plans involve open source software. Just last month, open source BI vendor JasperSoft Corp. recorded its 80,000th deployment, making it the world's most widely used BI software, according to the company. Nearly 20,000 developers have accessed BIRT Exchange, the open source BI community site sponsored by Actuate Corp. And Pentaho Corp. recently raised $12 million in funding, indicative of investors' confidence in open source BI.
With the cost of a typical commercial BI software deployment reaching well into six figures, open source BI software is an attractive option for many cash-strapped businesses and offers them a less expensive way to tap into the power of their data. And with a community of developers regularly adding code, new and customizable open source features emerge more frequently than do those of their commercial counterparts.
But open source doesn't mean free, and companies considering it still need to set aside budget dollars to cover maintenance and support fees.
"Open source is coming on," said an analyst. "There's interest in it and companies are growing more comfortable with it. In fact, research we did last year showed that people didn't have any [reservations] with open source business intelligence."
Open source in a tight economy
An economic downturn, in fact, may actually prove to be a boon for open source BI vendors. CIOs regularly highlight BI as a top priority, but with fewer resources, buying expensive software from commercial vendors like Business Objects and Cognos is difficult to justify. Investing in open source BI software, meanwhile, is a much easier sell.
But the benefits extend beyond a lower price tag.
Downloading and installing open source BI software, for one, is usually a quick proposition. Actuate's iServer Express, an open source report server for its BIRT Eclipse reporting tool, can be deployed in under an hour, according to Vijay Ramakrishnan, marketing director for the San Mateo, California-based software maker's Java group. Just try that with a commercial BI offering. A large and active community of developers, both outside and within the vendors themselves, also means the upgrade cycle for open source BI software is significantly shorter than it is for commercial offerings, whose cycles can last for years.
And the open source model makes customization easier. A company can deploy an open source BI system, gauge user reaction, then work with its own developers and the developer community at large to reshape the software to satisfy its particular needs. Commercial software can also be tailored, but the process is usually more cumbersome, as the code needed to make changes is not open to outside developers and can only be customized by the vendors themselves.
Posted in Analyst report, Business Intelligence |
Written on Friday, April 25, 2008 by Gemini
At home, your family’s waiting… At work, your boss is frowning as usual… But you have not yet finished working on the file on your desktop.
This is quite a familiar scenario for most working people. How often have we heard them wish, ‘if only we could complete this at home’… Accessing from home the files saved on your office computer was just a dream till now. However, it is now possible to access files saved on any computer from any other computer!!!
GoToMyPC allows you to remotely access your computer (assuming it is connected to the Internet) from any other computer (also connected to the Internet) in the world. This means that one can seamlessly access and send e-mails and work on projects, documents, or network resources without being physically present in that location! This site offers simple yet powerful software that has to be installed on both the computer you wish to access and the computer you would be accessing it from. The registration process is quite simple, and post-registration you can choose from the personal, premium and corporate options. While the personal option allows you to access two remote computers, the premium enables access to five computers, and you can access the whole office if you subscribe to the corporate plan.
The corporate account is a conscious effort by the company to help employees access their office work from a remote computer. Don’t worry about the security! This software is quite secure and can be used with your existing firewall setup without disturbing its integrity. An access code resides on the host computer and is never saved onto the website’s servers, so no more leaks. You’d also get notified if someone is trying to break in, and there is built-in lockout protection with which you can even disable the keyboard and screen in such cases. This works much like the virtual private networks (VPNs) that have long been in use in many companies; GoToMyPC is one of the first third-party services of this kind to see wide use. It is a convenience tool that many software companies are using so their employees can look at their files from home.
Try GoToMyPC now!!!
Posted in Communication research, Gadgets, New software, Products |
Written on Tuesday, April 22, 2008 by Gemini
Green networking = Efficient networking.
Efficient network design combines improvements in consumption and consolidation for increased manageability and lower lifecycle cost. Here you’ll learn more about the specifics of efficient network design you can implement to keep your network green, as well as how to avoid wasting bandwidth, power and budget.
In recent years, the call for "green" has grown louder. We hear it in the news and see it on billboards and in magazines - and, unfortunately, feel it in our pocketbooks (it cost $65 to fill up today). Regardless of your political affiliation or environmental beliefs, it's impossible to deny this fact: The cost of energy is increasing. As consumers, we feel the results of our inefficiencies in our daily budgets. As individuals responsible for designing network architectures, our employers feel those inefficiencies in their operating costs.
Why is that important?
When the Internet bubble burst in the early 2000s, the companies that survived were those that found a way to become efficient. These same businesses are now looking at ways to further increase their efficiencies without cutting their workforce -- and that includes every aspect of how they think and operate. In this article, I'm going to outline some elements contributing to this "green wave" as it relates to network design.
The "green" factor
What does it mean to be green?
It depends on who you ask! Efficiency is a broad term, especially in network architectures, but several key elements recur: consumption, consolidation, manageability and lifecycle cost.
Each of these elements is related, and their synergies create the semblance of a "total system." The purpose is to show that there are in fact different shades of green, and though it may be possible to create a design that encompasses all of these factors, benefits can result from focusing on just one.
From a design perspective, there are really two elements that can be thought of as inputs to network design:
- Consumption
- Consolidation
"Consumption" is the broadest of terms used most often to describe the power and space usage of network elements such as servers, routers, switches, firewalls and SANs. There are, however, other points that can be related to this term, but they aren't as easy to differentiate.
Consolidation is a distinct design option that can mitigate your consumption issues and provide an avenue for increased manageability - and subsequently decrease your cost of support. Here are a couple of technologies that consolidate infrastructure (a rough savings sketch follows the list):
- Virtualization (includes server, firewall, SAN, routers, switches, desktops)
- Chassis-based installation (FWSM, WSM, RSM, VPNSM, etc.)
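To put rough numbers on the consolidation payoff, here is a minimal sketch in Python; every figure in it (server count, wattages, duty cycle) is an assumption chosen for illustration, not a measured benchmark.

```python
# Illustrative consolidation arithmetic: ten standalone 300 W servers
# replaced by one 800 W virtualization host carrying the same workloads.
# All figures are assumptions for the sake of the example.
standalone_watts = 10 * 300
consolidated_watts = 800
hours_per_year = 24 * 365

kwh_saved = (standalone_watts - consolidated_watts) * hours_per_year / 1000
print(f"~{kwh_saved:,.0f} kWh saved per year")   # ~19,272 kWh
```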
The true trick to "getting green" is applying the principles without sacrificing these factors, or you risk losing the gains forged within the design itself.
Results
Lower consumption through consolidation results in increased manageability and lower lifecycle cost - or a "green(er)" infrastructure! The desired result of instilling some of these principles into the minds of engineers is that organizations can start taking advantage of savings gained through efficiencies.
Posted in Analyst report, Go Green, Networks |
Written on Saturday, April 19, 2008 by Gemini
American Apparel makes RFID sexy...
Retailer American Apparel Inc., known for its risqué ads and "Made in Downtown LA" label, is putting radio frequency identification tags (RFID) on every Boy Beater tank, Baby Rib brief, Cross-Back bra and Sleeve Ringer T-Shirt in its 17 New York metropolitan area stores. That's 40,000 items per store, each tagged with a high-tech chip, starting with the Columbia University location in Manhattan. The company is using Vue Technology's TrueVue software to manage the RFID data, Motorola Inc. RFID readers and antennae to capture the data and Avery Dennison Corp. tags to locate and store that data at the item level. Los Angeles-based American Apparel is on an aggressive timetable to roll out the sophisticated inventory tracking system to an additional 120 stores in North America.
The question is why. RFID tagging on crates -- never mind individual pieces -- is currently on the radar for only a handful of retailers, mainly behemoths like Wal-Mart Stores Inc. and its legions of suppliers now under orders to adopt transponder tags or else.
RFID is tricky, said John Fontanella, an analyst at AMR Research Inc. in Boston. "The use cases for RFID are not as obvious as some proponents would have you believe," he said. Automating a process like taking inventory changes the way people work, and that "requires a significant amount of re-engineering," Fontanella said, and up-front labor. There's little doubt of that, said Zander Livingston, RFID technology director at American Apparel. The manual tagging for a single store alone required multiple employees working three full days. But for American Apparel, with its numerous, nearly identical styles in a rainbow of colors, the technology makes a lot of sense.
"The No. 1 reason is inventory accuracy," Livingston said.
American Apparel differs from a lot of other apparel companies, Livingston said, in that it displays one -- and one only -- of each size, style and color of a particular item on its sales floor. Each item in each of its color and size variations has a place on the salesroom floor, but the company does not load up the racks with multiples of the same kind. "So basically, as soon as an item has been taken off the rack to be tried on, or purchased or just carried around the store as the customer is browsing, the item is no longer available. We also have a lot of items that look similar to each other, and because of that a lot of items get misplaced," Livingston said. The high sales volume often means that more than 1,000 items a day are moving back and forth between stockrooms and salesrooms.
"We'd have 10% of the items lost in the stock room that needed to be on the sales floor. Part of my goal was to make sure that we had a perfectly fitted sales floor, at 100% capacity," said Livingston, an old classmate of American Apparel CEO Dov Charney from their prep school days at the elite Choate Rosemary Hall, who was recruited by Charney to put RFID to work. Accuracy was a big problem. Taking manual count of 10,000 items on the sales floor and the 30,000 stocked in the basement means having to ask employees to come early or stay late. Fatigued employees might glance at the tag in the collar, but because of the similarity in styles, misidentify the item. RFID accuracy is 100%.
Tom Racette, director of RFID market development for Motorola's Enterprise Mobility Division, said RFID is becoming easier to implement. "A couple of years ago, this was all about the big challenging implementation, putting RFID across the supply chain. We are seeing more and more retailers, and businesses of all kinds, who are finding creative ways of implementing RFID in ways that are providing value," Racette said. Dr. Bill Hardgrave, professor of information systems and executive director of the Information Technology Research Institute at the University of Arkansas, said in a statement, "We've noticed an increasing trend among retailers that are implementing RFID at the item level, and American Apparel is a prime example of a retailer on the forefront of this trend."
By deploying the technology in additional stores, American Apparel expects to increase sales and customer service by having real-time visibility into products at nearby stores, enhancing the intrastore transfer process to balance stock, Racette said. Furthermore, the retailer will be able to respond more efficiently to market behavior by using RFID to record and report on purchases, not only within one location, but also across a region of stores.
CIOs are looking at RFID and seeing that inventory accuracy goes from 80% to 99%, as it did in the American Apparel pilot, and that RFID substantially reduces the cost of managing inventory, said Chris Schaeffer, director of RFID product marketing at Motorola. "CIOs are tasked with figuring out how to utilize the IT infrastructure to bring business value, and this can improve processes and efficiencies. It's good for a CIO to be able to say, 'See how this technology investment helped me do that,'" Schaeffer said.
Livingston did not give out a dollar figure for the investment, except to say the cost was about equivalent to the salaries of two full-time sales employees per store. The project is not without challenges, he cautioned. The deployment takes a "little more human interaction" than perhaps initially anticipated to verify everything being transported from the stockroom to the sales floor. He had to re-evaluate certain work movements so RFID readers were not blocked. It can be difficult to read tags around metal, for example. Down the road, he wants to design portals to capture theft -- people walking out the door with stuff -- but the "gates" have to be "aesthetically pleasing" so they are not at odds with American Apparel's meticulous stores. Livingston said the next step is to cut down on the manual labor by tagging items at the manufacturing plant. "I'm not going to tag another item in New York until there is source-level tagging."
Source: SearchCIO article
Posted in Communication research, Energy research, Gadgets, Interesting, Nanotechnology, RFID |
Written on Tuesday, April 15, 2008 by Gemini
Gartner has embarked on a wide-reaching new study of Google and its potential impact on IT, enterprise businesses, and society in general in the coming years. On April 10 at the Gartner Symposium ITxpo 2008 in Las Vegas, Gartner Vice President Richard Hunter revealed some of the first data points from this study.
The two most interesting points were:
1.) The best way to think of Google is as a disruptive technology.
2.) Disruptive technologies create big losers and big winners, and one of the biggest losers in the Google disruption could be traditional IT departments.
Google’s “Data Layer” includes both internally stored and externally accessed sources (Source: SRS, Google Analysis by Gartner)
- Google knows (almost) everything that is connected to the Web
- Google knows 67% of all Web searches
- Google knows 1% of what is sold on the Web
- Google knows the traffic to over 1.5 million Web sites
- Google knows the physical locations of many things
- Google knows the status of your machine if you install Google apps
- Google knows the behavior patterns of Google registered users
- Google is trying to know the physical location of any cell phone user who has installed Google apps or accesses Google services from the phone
Google as a disruptive technology
This new study is being conducted by a team of 15 Gartner researchers, led by Hunter, and the full report will be published in mid-2008. The title of Hunter’s presentation at ITxpo was “What Does Google Know?” The answer to that question was even more sobering than I expected, as the slide data above demonstrates.
Hunter added that Google will know a lot more about what’s sold on the Web if Google Checkout takes off, and could soon know a lot about medicine and health patterns if Google Health Records gets adopted.
The Gartner researchers have estimated that Google technology can address 100 exabytes of data (an exabyte is equal to a billion gigabytes). “Their infrastructure has unprecedented scale,” said Hunter, “and what is even more impressive is their ability to connect vast quantities of information… Google is sitting on the biggest pile of information that has ever been collected in the world.”
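As a quick scale check on that figure, the parenthetical’s unit conversion works out as follows (plain arithmetic, nothing vendor-specific):

```python
# 100 exabytes expressed in gigabytes and raw bytes (decimal units).
exabytes = 100
gigabytes = exabytes * 10**9       # an exabyte is a billion gigabytes
bytes_total = exabytes * 10**18
print(f"{gigabytes:.0e} GB = {bytes_total:.0e} bytes")
```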
The reason why Gartner chose to characterize Google as a disruptive technology - rather than just an Internet search engine company - is due to the ambitions that Google has for all of that data and the potential impact that those ambitions could have on the technology industry.
“Where the previous [computing] paradigm has been about my computer, my technology, my stuff … Google is trying to deliver any information, anywhere, to anyone in the world, on any device,” said Hunter.
“Google’s paradigm is a different paradigm. It’s an open source paradigm… We’re about to see a war of paradigms.” Clearly, the leader of the “previous paradigm” and the counter-movement to Google is Microsoft.
However, we also can’t forget that the Google paradigm includes massive privacy concerns. Hunter noted that Google continues to struggle to find the right balance between privacy, security, and its legitimate business interests. The more data Google collects, the bigger and more valuable target it becomes for electronic criminals. That will also make it a bigger target for governments, politicians, and citizen groups.
Hunter stated, “We believe Google’s information security will be a political issue worldwide by the end of 2010.”
Here are a few other interesting quotes from Hunter’s presentation, based on the study:
- “Google transcends the limits of the traditional OSI stack.”
- “We don’t know how good Google’s information security is.”
- “Google doesn’t worry about resources. Google’s always got more resources.”
- “Ask not what Google will do to you. Ask what you can do with Google … Ask how much of your business you want to expose to Google.”
- “Above all, move fast, because Google is moving fast.”
Google’s disruption to IT
“Google is disruptive, and disruptive technologies produce big winners and big losers,” Hunter said. “One of the big losers is potentially traditional IT departments.”
As part of his presentation, Hunter specifically noted a number of ways in which the Google revolution would disrupt the IT industry in general:
- Traditional database management vendors would be marginalized into handling only high value transactions
- Enterprises will co-opt Google’s approach to data management and Google could host the data
- Proprietary applications such as Microsoft Office would be “deeply threatened”
- Many application builders could start developing on top of the Google platform
- Collaboration services will take a big leap and Google could provide the platform
- Companies will take major parts of the IT infrastructure (e.g. e-mail, storage, and business intelligence) and source them to Google.
However, after the presentation I followed up with Richard to get further clarification on how IT departments could be significant losers in the Google disruption. Here was his response:
“Google has the potential to be the first-choice provider of many services that are now handled by internal IT organizations, starting with non-competitively-differentiating services such as email (which Google already provides to a number of enterprises), and ultimately including high-value-added functions and services such as business intelligence, mobile sales support, and others. Some IT organizations might consider it a boon to pass these functions on to Google so that the IT department can concentrate on very enterprise-specific, competitively differentiating applications. IT organizations that measure their worth in terms of how much of the company’s IT needs they supply themselves will be less happy to see Google move in on their turf - and I do mean specifically that in many cases it will be an argument about turf, not enterprise value.
“An important question is: can Google provide the quality (e.g. reliability, availability, security, etc.) that enterprises - a more demanding market compared to individual consumers - require from their suppliers? Consumers are satisfied when the potential provider says ‘Of course!’ Smart enterprises demand certification from someone besides the provider. Providing that certification will be something new for Google. On the other hand, many IT organizations aren’t mature enough to provide proof of their own capabilities in terms of value for money, and so will have a difficult time proving superiority over any external provider, whether or not it’s Google.”
Bottom line for IT leaders
What Gartner is arguing is that Google’s database and data center magic is creating a massive cultural movement and a competitive advantage that is going to sweep away businesses and industries and transform the technology world. In fact, Gartner sees Google becoming so large and powerful from a data storage and access standpoint that it is going to attract scrutiny - and potential regulations - from governments.
While these predictions have legs, several of the trends are larger than Google. As far as IT departments go, there are two related trends that will transform IT over the next decade: utility computing and managed services. The utility computing model will allow IT departments to deploy only the computing capacity that is needed and to track it and charge it to the appropriate business unit, department, or project. That will allow IT to tie the value of technology much more closely to business decisions.
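To illustrate the chargeback mechanics the utility model implies, here is a minimal, hypothetical sketch; the internal rate and the per-department usage figures are invented for the example.

```python
# Hypothetical utility-computing chargeback: each business unit is billed
# only for the capacity it actually consumed.
rate_per_vm_hour = 0.12   # assumed internal rate, in dollars
usage_vm_hours = {"marketing": 1200, "finance": 450, "engineering": 3100}

for dept, hours in usage_vm_hours.items():
    print(f"{dept:12s} ${hours * rate_per_vm_hour:8,.2f}")
```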
Some businesses won’t want to handle that type of IT internally and so they will outsource it to providers like IBM, Hewlett-Packard, EDS, and Verizon Business. It’s unclear whether Google will want to get into the managed services business, but it might make sense for them to partner with vendors like the four mentioned in order to offer services such as e-mail, storage, and business intelligence.
In terms of Google’s technical advantage - part of which is tied to its sheer data center capacity - let’s not forget that the other two big data center builders, Microsoft and Yahoo, could tie the knot soon and become a much more potent threat to Google’s vision. That could especially be the case if Microsoft allows its new technology leader, Ray Ozzie, to drive Microsoft in a much more Google-like direction centered around cloud computing. It’s also not a given that what Google has created in the world’s largest and most effective database isn’t something that Microsoft will eventually catch up to and co-opt.
Nevertheless, Google is obviously on the leading edge of many of the trends that are powering the next breaking waves in the technology industry, and the effects of these trends will fundamentally change the way corporate IT departments are organized, operated, and financed over the next decade.
Posted in Analyst report, General research, Google, Interesting, Software, Webservices |
Written on Monday, April 14, 2008 by Gemini
Buying business intelligence (BI) software can be a frustrating, difficult process. Expert Mark Whitehorn offers up the top 11 things BI buyers should -- and shouldn't -- consider.
Buying business intelligence (BI) software can be hard -- with technical evaluations, prioritizing requirements, getting the funding you need and avoiding political landmines, there's a lot to consider. But in my experience, these are the most important considerations for business intelligence software buyers. OK, so eleven is an odd number -- but in addition to the eight things you should consider, I wanted to cover three things you should not consider. In my experience, some people give far too much weight to certain issues that have little or no relevance when choosing a BI system, so it seemed valuable to list these as well.
1. Return on investment (ROI)
ROI is king. It's top of the list because it's the bottom line (if you see what I mean). We don't implement BI systems because they are trendy; we don't do it because the technology is fascinating. We invest the company's money in a BI system because we expect to get more money back, in terms of income or savings, than we invest. Of course, calculating the income/saving is often a major challenge, but it must not be ignored. All the remaining points essentially follow on from ROI.
2. User requirements
This could arguably be at the top of the list, but ROI got in first. There's no point building a BI system unless it delivers exactly what users are requesting/demanding -- so take the time to go through the requirements-gathering process with your business users, however painful it may be. Make sure you can deliver what people want, or just don't start -- a failed project helps no one (and certainly not your career path).
3. Ease of use
Traditionally, BI systems have been difficult to implement, set up, understand, drive – everything about them has been hard. The good news is that the situation is improving, so buy one that is easy to drive. Give serious consideration to ease of use for the end user, but also consider that the easier it is for your technical staff to build and deploy a BI system, the cheaper it will be to implement. The systems that are currently available vary hugely in ease of use -- so make ease of use a priority in all areas.
4. Existing expertise within the organization
Suppose your enterprise has a policy of using just one database engine and has developed a very experienced technical team on-site. If you buy your BI solution from the same vendor, you get double benefits. Almost certainly your staff will find the new tools easier to use, because of the family similarity that runs through products, and secondly, the staff will be happier. If you force them to use a product from a manufacturer they don't respect, they'll hate it on principle and blame it (and/or you) for everything that goes wrong. And they will make sure it does go wrong.
5. Compatible technologies
Notwithstanding the point made above, few vendors currently supply complete end-to-end BI systems. So, depending on your needs, you might not be able to source everything from one supplier. If that is the case, before buying any of the components, ask searching questions to ensure maximum compatibility with your existing infrastructure. All too often, individuals within the enterprise lobby for the purchase of a BI component without taking this into account. (I'm thinking here of, say, the finance officer who insists upon a particular analytical tool.) Compatibility lowers the cost of producing an integrated system (something that finance officer might appreciate, once you explain it).
6. Killer functionality
It may be that one BI software product alone offers a single piece of functionality that outweighs virtually every other consideration except ROI. I have no idea what that might be for your particular enterprise, but you'll know it when you see it (or your IT team will tell you about it, long and loud). It might be support for spatial data types, for example, allowing you to incorporate GPS data for tracking deliveries, or perhaps decomposition trees for innovative data visualizations. But sometimes, that one killer feature makes the whole investment worthwhile, as opposed to trying to get another product to do something that it really wasn't designed to do.
7. Data volume
How much data do you have -- and how much will you have in the future? If you're a large retail chain collecting point-of-sale data, you have lots of data. If you're a telecom company, you have lots of data. If you're NASA … and so on. Certain BI technologies do not scale well. In-memory querying is a case in point: It can be very effective with surprisingly large data volumes, but there are limits to what it can handle. Some software products (particular data mining algorithms, for example) scale badly. They may work well with a million rows, but with 10 million, they may run like a (slow) dog. Try to gauge data volume accurately and match it to software/hardware capabilities. Then make the vendor really prove to you that the software can handle it.
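As a back-of-envelope way of matching data volume to in-memory capacity, a sizing sketch like the following can help; the row count, average row width and machine capacity are assumptions, not limits of any particular product.

```python
# Rough in-memory sizing: rows x average row width vs. available RAM.
rows = 10_000_000
bytes_per_row = 200              # assumed average width
available_ram_gib = 8            # assumed machine capacity

needed_gib = rows * bytes_per_row / 2**30
print(f"Table needs ~{needed_gib:.1f} GiB; fits in RAM: {needed_gib < available_ram_gib}")
```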
8. Hardware
The hardware available for BI covers a huge range:
- Commodity standalone boxes.
- Commodity boxes bolted together to form Massively Parallel Processing (MPP) arrays.
- Dedicated MPP machines.
Costs vary accordingly. If you under-specify the hardware or try to use the wrong hardware for your new BI software, your system will never perform optimally and the ROI will fail to appear.
The business intelligence software buying points that you should not consider:
9. Cost
Cost isn't important; it's return on investment that counts. It's better to invest $5 million and reap $30 million than to invest $2 million and reap nothing. (Best of all is to invest $2 million and reap $30 million, of course.) With the right calculations and a convincing business case, you should be able to prove this to the money people at your company.
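To make the arithmetic concrete, here is a minimal sketch of the ROI comparison just described, using the paragraph's own hypothetical dollar figures.

```python
# ROI expressed as a fraction of the amount invested.
def roi(investment_m, return_m):
    """(gain - cost) / cost, with figures in millions of dollars."""
    return (return_m - investment_m) / investment_m

print(f"Invest $5M, reap $30M: ROI = {roi(5, 30):.0%}")    # 500%
print(f"Invest $2M, reap $30M: ROI = {roi(2, 30):.0%}")    # 1400%
```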
10. Current source systems
Existing operational systems such as the finance, CRM and human resources systems are typically underpinned by a database engine. Just because you're using Engine X for transaction processing does not mean you have to use it for the new BI project, for the simple reason that the Extract, Transform and Load (ETL) tool essentially sits as a buffer between them. Any good tool will be perfectly capable of extracting data from any number of different source systems and transforming it into any flavor you like. This doesn't mean you should ignore the existing expertise in your company – see above – but, in terms of functionality, there is little need to consider the existing engine.
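As a minimal illustration of that buffering role, the sketch below extracts rows from one engine, transforms them, and loads them into another; the tables and values are hypothetical, and a real ETL tool would add scheduling, logging and error handling.

```python
import sqlite3

# Hypothetical one-pass ETL: extract from the source engine, transform
# (cents -> dollars), load into the BI store.
source = sqlite3.connect(":memory:")   # stands in for "Engine X"
target = sqlite3.connect(":memory:")   # stands in for the BI warehouse

source.execute("CREATE TABLE sales (region TEXT, amount_cents INTEGER)")
source.executemany("INSERT INTO sales VALUES (?, ?)",
                   [("APAC", 125000), ("EMEA", 98000)])

target.execute("CREATE TABLE fact_sales (region TEXT, amount_dollars REAL)")
for region, cents in source.execute("SELECT region, amount_cents FROM sales"):
    target.execute("INSERT INTO fact_sales VALUES (?, ?)", (region, cents / 100.0))
target.commit()

print(target.execute("SELECT * FROM fact_sales").fetchall())
```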
11. The sales pitch
I don't know how to break this to you -- but some salespeople make things up. They exaggerate, omit pertinent information and even lie. This is sad, but inescapable. At best, they often lack a technical grasp of the capabilities of the systems they are offering. It is essential to talk to technically competent people and get them together with your technically competent staff. In my experience, technical people are less likely to stretch the truth. This is not a hard-and-fast rule, simply an observation based on experience. I have, however, heard a technical guy say: "Don't use our Component Y – it's rubbish." I've yet to hear the same words from a salesperson.
Source: TechTarget report
Posted in Business Intelligence, Software |
Written on Monday, April 07, 2008 by Gemini
Source: Gartner report, 2008
Software-as-a-service (SaaS) customers increasingly need to integrate their internal applications directly with the software functionality available from SaaS providers.
Vendors should implement a portfolio approach to their multienterprise integration strategy to best meet the diverse needs of their target market.
Key Findings:
- Multienterprise integration is complex and resource-intensive.
- SaaS customers need to deal with SaaS integration just as they do for multienterprise integration with other external business partners.
- SaaS vendors that choose a one-size-fits-all approach to multienterprise SaaS integration are more likely to fail to meet the diverse requirements of SaaS customers.
- SaaS vendors can build or outsource their business-to-business (B2B) infrastructure.
Recommendations:
- SaaS customers should evaluate multienterprise SaaS integration from SaaS vendors the same way they evaluate multienterprise integration solutions from other vendors.
- SaaS customers should ask SaaS providers for details about multienterprise SaaS strategy and pricing, and whether their preferred method of integration is supported.
- SaaS vendors should offer a portfolio approach to multienterprise SaaS integration; those that can't do this unilaterally should partner to accomplish this approach.
- When doing integration with multiple business partners, including a SaaS vendor, SaaS customers should implement a portfolio approach to multienterprise integration.
The Multienterprise SaaS Integration Problem:
Although some SaaS-based software functionality can be delivered via the ubiquitous Web browser, in many cases direct application integration between the software functionality of the SaaS provider and its customer's internal applications and systems is required. We refer to this scenario as "multienterprise SaaS integration." As is the case for internal application integration, the particular type of integration problem you are solving for multienterprise SaaS integration can vary.
For example, the problem you are solving may be data synchronization, process integration or composite application integration (see "Three Forms of Interapplication Integration in Healthcare"). Regardless of which of the three integration problems you are solving for multienterprise SaaS integration, the approach can vary widely: for example, batch vs. real-time interaction, or flat files vs. electronic data interchange (EDI), XML or Web services. In addition to achieving basic multienterprise integration, SaaS vendors also need to support multienterprise process visibility (for example, a view of the business process spanning the SaaS provider's and SaaS customer's applications) and compliance (for example, monitoring and enforcing security and service-level agreements).
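As a minimal illustration of one of these interaction styles, a real-time Web-service sync might look like the sketch below; the endpoint URL, payload shape and method are hypothetical, not any particular vendor's API.

```python
import json
import urllib.request

# Hypothetical real-time sync: push a changed customer record from an
# internal system to a SaaS provider's web-service endpoint. Batch-style
# alternatives would move flat files or EDI documents instead.
record = {"customer_id": "C-1042", "status": "active"}
req = urllib.request.Request(
    "https://saas.example.com/api/customers/C-1042",   # hypothetical endpoint
    data=json.dumps(record).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```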
Alternative SaaS Vendor Strategies for Multienterprise Integration:
Whether large or small, SaaS vendors will ultimately choose one of three strategies for multienterprise SaaS integration:
- One size fits all
- Any way you want it
- Portfolio approach
Posted in Analyst report, SaaS |
Written on Tuesday, March 25, 2008 by Gemini
India to remain the fastest growing IT Services country in the region, while Greater China will represent the largest regional opportunity by 2011…
The IT Services market in Asia Pacific (excluding Japan) will grow from US$37.5B in 2007 to US$55.9B in 2011, representing a compound annual growth rate (CAGR) of 10.5% from 2006 to 2011, according to the ‘Asia Pacific IT Services Market and Forecast, 2006-2011’ report by Springboard Research, a leading innovator in the IT Market Research industry. According to the report, the Indian IT Services market, with a CAGR of 18.6%, will remain the fastest growing in the region, although as a region Greater China will offer the largest market opportunity in dollar terms at the end of the forecast period.
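As a quick sanity check of the quoted growth rate, the standard CAGR formula applied to the 2007 and 2011 figures (four compounding years, an assumption about how the report counts) reproduces roughly the reported number:

```python
# CAGR = (end / start) ** (1 / years) - 1
start_usd_b, end_usd_b, years = 37.5, 55.9, 4
cagr = (end_usd_b / start_usd_b) ** (1 / years) - 1
print(f"CAGR ~ {cagr:.1%}")   # ~10.5%, matching the report
```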
“The Asia Pacific IT Services market is arguably the global leader in terms of growth, supplemented with a mix of mature and emerging markets,” said Phil Hassey, Vice President – Services Research at Springboard Research. “The markets of interest are not just the top four – China, India, Australia and Korea – but the emerging ones like Indonesia and Vietnam, which will register significant growth going forward,” Mr. Hassey added. The report uses Springboard’s Market Attractiveness Index to rank countries and individual IT Services markets on the basis of growth opportunities. According to the Market Attractiveness Index, the top ten countries in the region are:
1. People’s Republic of China
2. India
3. Australia
4. Korea
5. Indonesia
6. Vietnam
7. Malaysia
8. Rest of ASEAN
9. Singapore
10. Philippines
“For India and China, local capabilities, offerings and presence are just the start of a list of essential requirements for success. On the other hand, existing relationships, marquee clients and strong partnerships can provide capabilities for expansion in markets such as Hong Kong and New Zealand with relatively limited opportunities,” Mr. Hassey added. According to the report, Application Hosting, with a CAGR of 19.5% between 2007 and 2011, will register the fastest growth during the forecast period, although Enterprise Application Integration, at US$7.8 billion, will continue to be the largest component of the market by 2011. While Enterprise IT Outsourcing is the largest market in 2007, the reluctance of PRC firms to use the Enterprise IT Outsourcing model will reduce its relative size and weighting in the market by 2011.
As part of the report’s overall assessment of the APEJ IT Services market, Springboard Research offers several key outcomes and predictions for the industry in 2008. The report predicts that challenges in accessing and retaining IT skills will accelerate the shift to external services providers, as enterprises struggle to retain key individuals and skill sets in-house. Also, China will not challenge India as the home of offshore service delivery, especially for English-language requirements, as skill levels, quality, culture and governance all favour India over the PRC as the hub of global delivery.
About this report
Springboard Research’s ‘Asia Pacific IT Services Market and Forecast 2006-2011’ report offers an extensive and insightful perspective on the IT Services market across the Asia Pacific (excluding Japan) region. It covers 15 individual IT Services markets - including Infrastructure Support, Desktop Management, Enterprise Application Integration and IT Outsourcing - and 15 countries with respect to market size, key players and growth dynamics, and it forecasts demand and growth for each of them. The report also contains predictions for the IT Services industry for 2008.
About Springboard Research
Springboard Research is a next-generation IT market research and advisory firm. Springboard leverages its pioneering research model to deliver greater agility and flexibility in IT market research and helps its clients lead rather than follow market trends. Springboard works with the leading IT companies in the world in the software, services, telecommunications and hardware sectors. Founded in 2004, Springboard has a worldwide presence with offices in the United States, Australia, Singapore and Japan, as well as global research centers in India, Pakistan, and Morocco. Springboard has been acknowledged as an emerging leader and was recently named ‘Rising Star’ in the global IT market research industry by Outsell, the leading research and advisory firm for the information industry. For more information, please visit www.springboardresearch.com
Posted in Analyst report, China, India, New software, Software |
Written on Tuesday, March 18, 2008 by Gemini
When Canadian company Research In Motion (RIM) launched the BlackBerry in 1999, the revolutionary mobile device that enabled users to browse the Net, read emails in real time and send fax documents quickly earned the nickname ‘CrackBerry’, an allusion to its notoriously addictive features. Like all secure internet services, RIM uses an encryption code that scrambles the email messages sent out from a BlackBerry device and then unscrambles them again when they reach their target. Only, BlackBerry uses a highly complex algorithm for the purpose — a 256-bit Advanced Encryption Standard (AES) process. The Intelligence Bureau (IB) of the Government of India can allegedly decode messages with an encryption level of up to 40 bits. (According to cyber security experts, there’s a rigid decryption technology hierarchy in the world: the US has the most advanced software, Europe gets tech that’s one generation behind, and countries like India have even older decoders.)
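To give a flavour of what a 256-bit AES operation involves, here is a minimal sketch using the third-party Python cryptography package; it illustrates the standard itself, not RIM’s actual implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)      # 32 bytes = 256 bits
nonce = os.urandom(16)    # fresh per-message value for CTR mode

# Scramble the message with AES-256...
ciphertext = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor().update(
    b"confidential email body")

# ...and unscramble it. Only a holder of the key can do this.
plaintext = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor().update(ciphertext)
assert plaintext == b"confidential email body"
```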
So, if intelligence agencies cannot crack BlackBerry’s email code, they can still do one of two things — get the government to force RIM to scale down its encryption to 40 bits, or, better still, ask for the “keys” that will unlock the code. Section 69 of the IT Act, 2000 does give the government the power to intercept electronic information, but such sweeping surveillance is clearly stretching the law. And what impact will it have on e-commerce? People will be extremely concerned about sending business details over the Net. For instance, the licensing norms for ISPs in India were created in 1998-99. Accordingly, licenses issued to ISPs forbid encryption above 40 bits. Today, a 40-bit code can be cracked in no time. A browser like Internet Explorer 7 has a 128-bit code. So, any web provider using encryption of over 40 bits has to provide the keys to the government. This means that the government has the means to track transactions and correspondence on these websites — an access it doesn’t have on the BlackBerry platform, since the ISPs providing these services were, for some reason, never asked to hand over the encoding key.
Terror organizations are constantly changing their footprint and upgrading their technology. If security agencies have tracked, say, 555 web pages linked to the terror network today, tomorrow they may all disappear and return in modified form. It’s a nightmarish scenario for security agencies. At the same time, the powers of surveillance can be misused; that’s a devil you have to live with. Unfortunately, the legal and political framework needed to check the misuse of cyber-snooping by politicians is lacking in the country, a point many cyber experts are making. Can the intelligence agencies ensure fair play? People may be willing to give up some of their civil liberties to deal with the security threat to the country, but there should be a clear-cut policy framework and laws on what kind of intrusion is lawful and what is not. Clearly, there’s room for legislative action and transparency in cyberspace.
SHORT CUT TO ENCRYPTION
What? – In IT, encryption is software that uses advanced algorithms to scramble a message being sent out in cyberspace. The message is unscrambled when it reaches the recipient.
Why? – It’s a security measure that prevents internet data from being read by unintended persons.
Who Uses It? – Web browsers, e-commerce and banking sites, and email service providers all use encryption software to ensure secure transactions and confidentiality.
Does It Change? – Yes, encryption technology is constantly evolving. A few years back, 40-bit codes were considered safe; now, 128-bit codes are the default on most sites. BlackBerry uses a more advanced 256-bit algorithm.
Can It Be Cracked? – Software can be developed to crack encryption codes, and security agencies use such tools to monitor data flow in cyberspace. Longer codes are, naturally, harder to crack.
What Is A Key? – The sequence of bits used by an encryption algorithm to scramble a message and unscramble it again. It unlocks the code (see the sketch below).
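To make the scrambling and the key concrete, here is a minimal sketch using the Python `cryptography` package’s AES-GCM primitive with a 256-bit key, the same Advanced Encryption Standard key length the article attributes to BlackBerry. The message text is invented for illustration, and real deployments layer key management and transport protocols on top of this.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The key is the secret "sequence of bits" that both scrambles
# a message and unlocks it again.
key = AESGCM.generate_key(bit_length=256)

# GCM needs a fresh nonce for every message encrypted under one key.
nonce = os.urandom(12)

plaintext = b"Quarterly figures attached. Do not forward."  # illustrative message
ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)    # scrambled bytes

# Anyone holding the key (and nonce) can unscramble the message;
# anyone without it is left brute-forcing a 256-bit keyspace.
recovered = AESGCM(key).decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```

This is exactly the trade-off in the interception debate: handing over the key removes the need to crack anything at all.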
Source: Sunday Times, March 16, 2008
Posted in Communication research, India, Mobile, Security |
Written on Tuesday, March 11, 2008 by Gemini
China leads the pack with SOA integration dominated by local players…
Singapore, March 11, 2008: Springboard Research, a leading innovator in the IT Market Research industry, today reported that local System Integrators (SIs) and Independent Software Vendors (ISVs) are playing a significant role in SOA vendors’ ability to penetrate four major domestic markets in Asia. This is especially evident in the Chinese market that is dominated by local players. These are the findings of Springboard’s latest research covering Asia’s Service-Oriented Architecture (SOA) market, based on a survey of 354 CIOs and IT managers of large and mid-market enterprises in China, India, Singapore, and Australia.
“Local SIs and ISVs form an important part of the SOA ecosystem by integrating systems well, and by building customized applications on vendor platforms,” said Balaka Baruah Aggarwal, Senior Manager for Emerging Software for Springboard Research. “While international software vendors also offer integration and consulting services directly, ISV/SI partners are key providers of these services,” added Ms. Aggarwal.
The local ISV/SI partner landscape is unique across most of Asia because, until now, many multinational vendors have worked in the region with their top-tier global integration partners. The Indian market is a notable exception: there, global IT companies such as IBM, HP and Microsoft dominate mindshare as SOA players, despite the presence of home-grown IT giants like TCS, Infosys, Wipro, HCL and Satyam.
“The Indian players are now beginning to expand both in the domestic market and neighbouring markets in the region. The case for Chinese integrators is just the opposite, as they have established their hold on the domestic market and are now on the prowl to expand their regional and global presence,” added Ms. Aggarwal.
Springboard has scanned the SOA partner landscape and identified key vendors with a prominent presence in Asia. These local SOA leaders include:
- Kaz Group – Australia
- Kingdee – China
- TongTech – China
- Patni Systems – India
- Satyam Computers – India
- TCS – India
- Wipro – India
- TmaxSoft – Korea
- Samsung SDS – Korea
- NCS – Singapore
“Integration skills of partners play a critical role in successful SOA projects, as SOA involves bringing together disparate IT systems,” Ms. Aggarwal explained. “The battle for SOA has extended from simply marketing SOA solutions to seeking out partners who have good integration skills and reach in the local markets. Ultimately, it is good partners who will make the difference in vendors’ ability to woo customers,” she added.
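For readers outside the integration business, a tiny, hypothetical sketch shows what “bringing together disparate IT systems” looks like at its simplest: an existing back-office routine wrapped behind a network service interface so that other systems can call it. The lookup function, route and port here are all invented for illustration; real SOA stacks add contracts, registries and middleware on top.

```python
# Minimal sketch of the service-oriented idea: expose a legacy routine
# over HTTP/JSON so disparate systems can consume it. All names invented.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def legacy_inventory_lookup(sku):
    # Stand-in for an existing back-office routine.
    return {"sku": sku, "in_stock": 42}

class InventoryService(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.rstrip("/").split("/")[-1]  # e.g. GET /inventory/ABC123
        body = json.dumps(legacy_inventory_lookup(sku)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), InventoryService).serve_forever()
```

The integration skill the analysts describe lies in doing this reliably across dozens of incompatible systems, not in the wrapper itself.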
The study also found that price is not the number one criterion in vendor selection. The most important reasons for choosing a vendor are proven products and solutions, clearly defined deployment roadmaps and vendor reputation. On the other hand, the perception that SOA is expensive emerged as the top inhibitor of SOA deployment.
“As SOA is a strategic initiative, the process requires investment and a long-term organizational commitment. Further, since business managers typically control the budget in an organization, particularly for extended strategic projects, vendors need to target business managers along with technology managers,” said Ms. Aggarwal.
About This Study
Service-oriented architecture (SOA) has been one of the IT industry’s hottest buzzwords over the past several years. IT vendors are evangelizing SOA, and many organizations are looking at SOA to help them better integrate and leverage their existing and future software applications and infrastructures. SOA’s popularity lies in its promise to help organizations improve operations, cut costs, and boost efficiencies, while IT vendors see the technology as a way to tap into new revenue streams and acquire larger enterprise accounts. Springboard Research’s SOA Market Canvas is an ongoing research service that provides extensive SOA market coverage for the Asia Pacific region. The SOA Market Canvas examines key trends in the Asia Pacific SOA market and offers an array of SOA market data on an ongoing basis. Springboard’s Market Canvas service delivers a deeper level of research than other reports of its kind and assesses data at a granular level to help IT vendors formulate better SOA go-to-market plans.
About Springboard Research
Springboard Research is a next-generation IT market research and advisory firm. Springboard leverages its pioneering research model to deliver greater agility and flexibility in IT market research and helps its clients lead rather than follow market trends. Springboard works with the leading IT companies in the world in the software, services, telecommunications and hardware sectors. Founded in 2004, Springboard has a worldwide presence with offices in the United States, Australia, Singapore and Japan, as well as global research centers in India, Pakistan, and Morocco. Springboard has been acknowledged as an emerging leader and was recently named ‘Rising Star’ in the global IT market research industry by Outsell, the leading research and advisory firm for the information industry. For more information, please visit http://www.springboardresearch.com/.
Posted in Analyst report, China, India, SOA, Software Vendor, System Integrator |
Written on Monday, March 10, 2008 by Gemini
LONDON: E-mail has joined the cigarette break and the humble coffee run as the latest threat to workplace productivity.
Researchers have found that e-mail has gone from being a useful office tool to a curse that eats up huge amounts of work time, ‘The Daily Telegraph’ reported on Monday. According to the researchers, the average employee now spends an estimated 90 minutes to two hours a day wading through hundreds of messages, many of which are simply spam and junk mail.
The study by the Radicati Group has found that worldwide email traffic has hit 196 billion messages a day. It is predicted to reach 374 billion per day by 2011.
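Those Radicati figures imply compound growth of roughly 24 per cent a year. A quick check, treating 2008 to 2011 as a three-year span (the per-day volumes are the ones quoted above):

```python
# Implied compound annual growth rate of worldwide email traffic,
# from 196 billion messages/day (2008) to a forecast 374 billion (2011).
daily_2008, daily_2011, years = 196e9, 374e9, 3
cagr = (daily_2011 / daily_2008) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # -> about 24.0% per year
```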
"Employees're now so deluged with messages that emails have become a broken business tool in urgent need of fixing. There's been no innovation to separate the junk letters from the real ones," Jason Preston of the Parnassus Group, a social media consultancy, was quoted as saying.
A related study by another research firm, Telewest Business, recently found that email and telephone habits could reduce productivity rather than increase it, and that "men are the biggest timewasters at work". According to the research, the misuse of telephones and emails at work was hindering office workers from doing their jobs, reinforcing bad habits and lengthening the working day.
Of the 1,468 people questioned, the average time spent each day waiting for or chasing responses to urgent emails, and on unnecessary emails, was 42 minutes. An average of 27 minutes was wasted responding to voicemails or managing phone calls, and another 12 minutes were lost trying to locate colleagues: a total of 81 minutes a day, the study found.
Source: PTI. Monday, March 10, 2008
Posted in General research |
Written on Wednesday, March 05, 2008 by Gemini
Fair hostess Kerstin Koch-Weger displays a Toshiba M700 tablet PC at the CeBIT 2008 trade fair in Hanover on Monday. Depending on the model, the M700 series of laptops comes equipped with a 2.2GHz Intel Core 2 Duo processor, 1-4GB of RAM, an Intel X3100 graphics accelerator, an 80-160GB hard drive and a DVD-RW drive. Supporting USB 2.0, Bluetooth and Wi-Fi, the devices offer security features such as a fingerprint reader, multi-level password security and hotkey security. The M700 series retails from $1,449 (Rs 59,000) upwards.
Posted in Gadgets |