Wednesday, May 27, 2009

Want to Find Bloggers in Your City or Region? Here's a Tip!

Missing an old friend? Or maybe hunting for your friends' blogs? Or just curious which bloggers live in the same city as you? If not, how about checking out bloggers from other cities?
Here's the tip!

To find bloggers from all over Indonesia, use this URL:
http://www.blogger.com/profile-find.g?t=l&loc0=ID

And for bloggers within a single province:
http://www.blogger.com/profile-find.g?t=l&loc0=ID&loc1=Jawa+Barat

Finally, for bloggers within a single city or regency:
http://www.blogger.com/profile-find.g?t=l&loc0=ID&loc1=Jawa+Barat&loc2=Bandung

Note: just replace the location values with the country code, province name, and city name you want to search for...
ID : the country code for Indonesia!
Jawa+Barat : the province name. The + sign stands in for a space in province names of two or more words.
Bandung : the city/regency name.
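The URLs above are just one endpoint plus a query string with up to three location parameters. If you'd rather build them programmatically, here's a minimal Python sketch (the helper function name is mine, not part of any Blogger API); `urlencode` handles the `+`-for-space encoding automatically:

```python
from urllib.parse import urlencode

def blogger_search_url(country, province=None, city=None):
    """Build a Blogger profile-search URL for a location.

    country is a country code (e.g. "ID"); province and city are
    optional plain-text names. urlencode() replaces spaces with '+',
    matching the URLs shown above.
    """
    params = {"t": "l", "loc0": country}
    if province:
        params["loc1"] = province
    if city:
        params["loc2"] = city
    return "http://www.blogger.com/profile-find.g?" + urlencode(params)

print(blogger_search_url("ID", "Jawa Barat", "Bandung"))
```

Calling it with just `"ID"` reproduces the all-of-Indonesia URL; adding the province and city narrows the search, exactly as in the three examples above.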

Share it around!


Tuesday, May 26, 2009

Bugs hit Facebook application verification program


Facebook's Application Verification Program, controversial due to its concept of charging developers to have their applications certified as "trustworthy," has run into technical problems.

Announced in November and launched on Wednesday, the program has system bugs that are preventing developers from reaping some benefits of having paid to have their applications reviewed and approved.

In a thread on the official Facebook developer forum, developers who shelled out the $375 review fee began reporting a variety of system problems on Wednesday.

In that same thread, Facebook on Thursday afternoon acknowledged that at least three of the bugs reported exist and that the company is working to fix them.


For example, the green checkmark that denotes an application's verified status isn't appearing in the Applications Directory search results. Without that badge, verified applications look no different from those posted by developers who didn't pay for the review.

In addition, some developers are reporting that they can't submit their applications for review because the link to do so doesn't work, another bug Facebook has acknowledged exists for some applications.

Another bug Facebook has acknowledged is that the boost in user notifications and requests that verified applications get isn't always showing up in the developer's control panel stats.

Other developers complained in the thread that they couldn't find their applications at all -- green checkmark or not -- although this may be due to the way the Facebook algorithms work in displaying certain applications to certain people and not others.

The program became instantly controversial when it was announced in November because critics said developers shouldn't have to pay to have their applications labeled "trustworthy." They argued that it should be up to Facebook to ensure that applications built for its site comply with this requirement.

In response, Facebook has said that, in fact, all of the more than 52,000 applications on its platform must comply with requirements and policies that make them trustworthy. The Application Verification Program, which is optional, gives developers a chance to make their applications stand out by adopting an additional set of best practices for them regarding user experience and user communications, according to Facebook.

Still, some Facebook developers remain unconvinced about the value of the program, and even more so now with the technical issues affecting it.

"I will not pay to be approved. It's not worth the money. Any good application will do just fine without it," said Christopher Bourton, games developer and consultant at Lethos Designs in London, which has developed three Facebook applications and is building two more.

Bourton, contacted via e-mail on Thursday, said he fears that the program will create "an elitist two-tier system" in which large developers that can pay the fee will get the benefits, while smaller developers with fewer resources will not be able to afford it.

Applications approved through the system get the verified status for 12 months, after which developers must re-submit them for review and pay the $375 fee again.

Other developers are more positive about the program, like Tim O'Shaughnessy, CEO of LivingSocial, which has created about 10 applications for Facebook, including its very popular namesake and Visual Bookshelf.

"The verification program is a nice way of allowing users to weed through the noise and know [that] if they're adding a [verified] application, there is a sense of trust behind that add," O'Shaughnessy said via e-mail.

Living Social submitted Visual Bookshelf for verification and got it approved, but while the process was fairly simple and straightforward, it took Facebook longer than O'Shaughnessy expected to complete its review. "Now that the initial applicants have been verified, however, my guess is the process will be much shorter in the future," he said.

For O'Shaughnessy, a big question regarding the value of the program is whether it will truly give Facebook users a sense of security towards verified applications. He also hopes that Facebook will continue to evolve the program.

"As new Facebook features and functionality are made available, will the verification program keep up with new, relevant additions? This seems like a necessity in order for the program to have long-term value," he said.

Gartner analyst Ray Valdes thinks that establishing the program was a good move by Facebook. "Facebook's value proposition is having a quality user experience and that includes the experience of applications," Valdes said in a phone interview. "As the number of applications has grown, the quality of the experience has decreased. This is part of their ongoing maintenance and cultivation of the user experience."

IDC analyst Al Hilwa concurs that end users will benefit from having a set of applications that Facebook has certified as meeting special criteria for user experience and trust. "I think this is a welcome move to rein in what could potentially be a tiring process of finding well-behaved and trustworthy apps," he said via e-mail.

"Relying on market forces to sort out the wheat from the chaff may work in the long run and sounds good as an ideal, but with the velocity of business these days, and the ephemeral stickiness of online sites, it is maybe too late for a platform to be successful to wait for that process to settle down," Hilwa added.

On Wednesday, Facebook launched the program with an initial set of 120 verified applications, but it expects developer interest to pick up considerably now that the program has been launched.


Sunday, May 24, 2009

MySQL, Forked beyond repair?


One month after Oracle announced its takeover of Sun Microsystems, the future of MySQL remains up in the air. Can the leading lightweight open source database still thrive when it's controlled by the leading proprietary commercial database vendor? So far, the prognosis doesn't look good.

Even before the Oracle buyout, there were signs of strain within the MySQL community. Not long after Sun acquired MySQL in 2008, key MySQL employees began exiting the company, including CEO Mårten Mickos and cofounder Monty Widenius. Widenius, in particular, was vocally critical of the MySQL development process under Sun's stewardship, citing rushed release cycles and poor quality control. Another MySQL cofounder, David Axmark, left out of frustration with the bureaucracy and tedium of Sun's buttoned-down corporate culture.


In the wake of this exodus came another ominous development: Forks of the MySQL codebase began to appear, including Drizzle and MariaDB, offering users and contributors ways around Sun's control of the main branch. Drizzle is an attempt to shed some of the feature bloat that has crept into recent MySQL releases, in favor of a lightweight database server aimed at cloud computing and Web applications. MariaDB, on the other hand, aims to be feature-compatible with MySQL, but it uses a brand-new, transaction-capable storage engine by default. And perhaps even more significantly, MariaDB is spearheaded by none other than Widenius himself.

If that wasn't troubling enough for MySQL's new minders at Oracle, Widenius has since dropped the other shoe. Last week, he announced the formation of the Open Database Alliance, a "vendor-neutral consortium" whose stated aim is to become "the industry hub for the MySQL open source database, including MySQL and derivative code, binaries, training, support, and other enhancements for the MySQL community and partner ecosystem." Notably, no one from Oracle is listed among the Open Database Alliance's contacts.

If all this leaves you scratching your head, you're not alone. In March, former MySQL employee and Drizzle developer Patrick Galbraith wondered aloud just which branch of MySQL should be considered "official" these days. The ultimate answer to that question will determine the fate of the MySQL database.

Can Oracle keep the MySQL product relevant?
Of course, there can only be one real, official version of MySQL: It's the one that was originally developed by MySQL AB, was later bought by Sun, and was finally acquired by Oracle. Oracle now owns all the copyrights, trademarks, and other intellectual property associated with the MySQL name -- and that intellectual property has always been defended vigorously. MySQL had even been known to send trademark-violation notices to partners that had the temerity to call their service offerings "MySQL support" instead of "support for MySQL databases."

While that's all well and good, however, the MySQL brand alone isn't likely to comfort customers who worry that an open source database won't get the attention it needs from one of the world's largest commercial software companies. Already some customers must be questioning Oracle's commitment to a low-end product like MySQL, when it has a lucrative, proprietary database to sell. And as the MySQL community fragments and begins turning to alternatives, the economics of Oracle's MySQL business become steadily less attractive.

But if MySQL's approval ratings are slumping, all the more reason for Oracle to move decisively. Oracle must work to regain the trust and support of the MySQL community or risk losing mindshare to a fork, such as Drizzle or MariaDB. To do that, it has to avoid making the mistakes that Sun made when it acquired MySQL. In a sense, to succeed with MySQL, Oracle will have to stop acting like Oracle.

Open source customers are a notoriously fickle bunch. If one project doesn't deliver what users need, the users go elsewhere -- and the same goes for developers. Forks are a fact of life in the open source community, and arguably an entirely healthy one. Oracle just better hope it doesn't end up on the wrong side of the fork.

When projects fork, there are winners and losers
Coincidentally, a similar drama is playing out elsewhere in the open source world right now. This one concerns glibc, the Gnu standard C library that is used by practically every piece of software running on Linux systems. Earlier this month, the Debian Project opted to switch its entire distribution from using standard glibc to a fork called eglibc. Ostensibly the new fork works better for embedded systems programming, but the community scuttlebutt says the switch was really the result of ongoing problems with glibc's notoriously obstinate maintainer, Ulrich Drepper. (One contributor even went so far as to file a bug report describing the problem.)

The name eglibc is surely no accident. It echoes an earlier, contentious incident in which a group of developers working on gcc, the Gnu C compiler, frustrated by the project's restrictive contribution model, split off to form a new fork called egcs. Freed from excessive bureaucracy, the fork thrived, while development on the main gcc branch stagnated. Eventually the original branch died off completely; all that was left was for egcs to formally change its name back to gcc. The fork became the main branch -- and according to some egcs developers, that had been their intention from the very beginning. It seems likely that the eglibc maintainers have something similar in mind.

There's a real lesson to be learned here for Oracle and other maintainers of open source projects. Creeping authoritarianism in the development and contribution processes is something that many users of open source software are simply unwilling to tolerate; quite naturally, the contributors with the most to offer are the most likely to become frustrated when they feel stymied by red tape (or simple pigheadedness). Projects that are maintained by commercial entities are particularly susceptible to this tendency. Twelve years after Eric S. Raymond published his landmark essay, "The Cathedral and the Bazaar," too many projects -- and companies in particular -- can't seem to let go of their cathedral mentalities.

That's why the best course of action for Oracle would be to join the Open Database Alliance and take an active role in the ongoing development of MySQL in an open, community-driven way. Oracle acquired MySQL as an asset of Sun, but Sun never knew what it had in the first place, nor how to manage it. If Oracle can't figure out how to do what Sun couldn't, it will still own the MySQL name; unfortunately, however, that name won't mean much.


Saturday, May 23, 2009

VMware vSphere 4, The once and future virtualization king


VMware vSphere 4, out today, is a big release, with plenty of new features and changes, but it's not your run-of-the-mill major update. The new features, which range from VM clustering to agentless VM backup, are especially significant in that they may mark the moment when virtualization shifted from providing a stable replica of a traditional infrastructure to significantly extending what a virtual environment can do.

In short, if you're running a VMware infrastructure, life should get easier. For anyone who's ever tried to provide rock-solid OS-based clustering services, the new VM clustering feature, called Fault Tolerance, should be a vast improvement. Hot Add of CPUs and RAM has never really been an option for most shops, but it suddenly is (with the right OS, of course). These moves show that VMware is still pushing the virtualization envelope.

Considering the scope of the upgrade, perhaps "VMware Infrastructure" did warrant a new name, but let's hope that VMware stops there. The company has a bad habit of changing the names of its products every few months, and it's getting tiresome trying to explain why VirtualCenter, vCenter, VI3, V3i, ESX, ESXi, and now vSphere are all basically the same product or parts of the same product suite.

Along with new features and improvements, vSphere brings more hardware resources to VMs. You can now add up to eight vCPUs to a single VM; previously, VMs were limited to four. The new RAM limit is 255GB, up from 64GB. The ESX hosts themselves can now support up to 64 cores and 512GB of RAM. Also -- though I haven't had a chance to test this -- it appears that you can map raw PCI devices to a specific VM.

VMware's also making some noise about performance enhancement for key technologies, such as claims of 20 percent performance improvement in Microsoft SQL Server throughput, and a claim of a 10x performance bump for iSCSI. That last claim may be just a bit exaggerated, as it appears to be based on the support of 10Gig iSCSI interfaces, rather than an improvement in VMware's internal iSCSI software initiator, which has always been a bit sluggish.

Speaking of performance, the performance graphs and data available in vSphere are much improved over the current release, with a more intuitive layout and better overall access to specific information regarding the performance of a VM or a host.


Thursday, May 21, 2009

Microsoft, HP expand unified communications push

Microsoft and Hewlett-Packard have expanded their partnership to develop and sell a common platform for delivering voice, video, and messaging services to office workers.

The companies plan to spend $180 million over the next four years on developing products and services for unified communications, and on sales and marketing for those products, said Meg Shea-Chiles, worldwide director for HP's partnership efforts with Microsoft.


The work will include development around HP's ProCurve networking products and Microsoft's Office Communications Server, Office SharePoint Server and Exchange products. HP will also certify its TouchSmart Business PCs and some smartphones for Microsoft's communications software, as well as some new IP desk phones that HP plans to develop.

Unified communications lets workers do things such as listen to a voice-mail from within an e-mail program and make calls from an IM client. It includes presence software to show when people are online and what communications systems they have access to at that moment, as well as software that aims to simplify management on the back end.

Vendors such as Microsoft and Cisco are pushing unified communications as a way to make office workers more productive while keeping costs down. Many have latched onto the recession as a good time to promote these products, especially when companies are cutting back on travel and in-person meetings.

Microsoft and HP have an existing unified communications partnership that goes back several years. In 2006 they said they would deliver unified communications systems using Microsoft software and HP's blade servers, storage gear, and professional services.

HP also partners with Cisco for unified communications. Its ProCurve gear has made the two companies compete more directly, but Shea-Chiles said HP's Cisco partnership continues unchanged.

She wouldn't say how many joint unified communications customers Microsoft and HP have today, but she said the investment they plan to make is "significantly" more than it has been. She also wouldn't say how much of the money they plan to invest is for product development and how much for sales and marketing.

The joint work will also involve adding greater support for Microsoft's Office Communications Server in HP's Business Technology Optimization software, including the ability to provide real-time quality-of-service metrics for IP-based voice and video traffic. OCS users will also be able to join telepresence meetings conducted with HP's Halo system.


Tuesday, May 19, 2009

New Windows netbooks may harbor malware

After discovering attack code on a brand new Windows XP netbook, antivirus vendor Kaspersky Labs warned users yesterday that they should scan virgin systems for malware before connecting them to the Internet.

When Kaspersky developers installed their recently-released Security for Ultra Portables on an M&A Companion Touch netbook purchased for testing, "they thought something strange was going on," said Roel Schouwenberg, a senior antivirus researcher with the Moscow-based firm. Schouwenberg scanned the machine -- a $499 netbook designed for the school market -- and found three pieces of malware.


"This was done at the factory," said Schouwenberg. "It was completely brand new, still in its packaging."

With a little more digging, Schouwenberg found multiple Windows system restore points, typically an indication that the machine had been updated with new drivers or software had been installed before it left the factory. One of the restore points, stamped with a February date, included the malware, indicating that it had been put on the machine before then. And the malware itself hinted how the netbook had been infected.

"In February, the manufacturer was busy installing some drivers for an Intel product in the netbook," said Schouwenberg, citing the restore point. Among the three pieces of malware was a variant of the AutoRun worm, which spreads via infected USB flash drives.

"The USB stick they used to install the drivers onto the machine was infected, and [it] then infected the machine," said Schouwenberg. Installed along with the worm was a rootkit and a password stealer that harvests log-in credentials for online games such as World of Warcraft.

Kaspersky has reported its findings to M&A, said Schouwenberg, but the netbook maker has not been in contact with the security company since then.

Although factory-installed malware is rarely found on consumer electronics, there have been cases. Last December, for example, Amazon.com told customers it had sold Samsung digital photo frames before the holidays that came with a driver installation CD infected with a Trojan downloader. "These [cases involving computers] are much rarer than picture frames," said Schouwenberg.

To ensure that a new PC is malware-free, Schouwenberg recommended that before connecting the machine to the Internet, users install security software, update it by retrieving the latest definition file on another computer and transferring that update to the new system, and then run a full antivirus scan.

"That's the best course of action, even though it sounds like a lot of work," said Schouwenberg.


Friday, May 1, 2009

Why More Megapixels Don't Make Better Pix


When it comes to electronics, more is better. Consumers want more features, more hard-drive space, more cellphone minutes and more battery life.

But with digital cameras, it's not that simple. Many stores will tell you that the worth of a camera is measured in megapixels. The more manufacturers can pack in, the better, right?

Not necessarily, says Amit Gupta, founder of Photojojo.com, an online newsletter for camera tips and projects.

A high-megapixel count doesn't always equate to better image quality. Actually, if camera designers try to cram too many megapixels into a small camera, it can have the opposite effect.

Such a counterintuitive snag mostly affects tiny digital cameras, the ones compact enough to fit in your pocket.

To keep sizes down, manufacturers place itty-bitty image sensors inside their point-and-shoot models. These small parts perform well within a certain range. But when companies try to raise the megapixel count without increasing the dimensions of the camera, the same size sensor now has to do more work.

The result is larger but less accurate images, Gupta says. The overburdened sensor can lose sharpness, struggle in low-light situations, and add "noise" (small blotches or odd colors).

Digital SLR cameras are bulkier than sleek point-and-shoots, but the extra room allows for much bigger sensors and often better image quality per megapixel.
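The tradeoff can be put in rough numbers: at a fixed sensor size, the pixel pitch (the width of each photosite) shrinks as the megapixel count grows, and smaller photosites gather less light. A quick sketch; the sensor dimensions below are illustrative assumptions of mine, not figures from the article:

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate the width of one photosite in micrometers,
    assuming square pixels in a grid that fills the sensor."""
    area_um2 = (sensor_width_mm * 1000) * (sensor_height_mm * 1000)
    return math.sqrt(area_um2 / (megapixels * 1e6))

# A typical compact-camera sensor (assumed roughly 5.8 x 4.3 mm)
# at increasing megapixel counts:
for mp in (6, 10, 14):
    print(f"{mp} MP compact: {pixel_pitch_um(5.8, 4.3, mp):.2f} um per pixel")

# A digital SLR sensor (APS-C, assumed roughly 23.6 x 15.7 mm) at 10 MP:
print(f"10 MP APS-C:  {pixel_pitch_um(23.6, 15.7, 10):.2f} um per pixel")
```

On the same compact sensor, going from 6 to 14 megapixels cuts the photosite width by roughly a third -- exactly the light-gathering loss Gupta describes -- while the APS-C photosite comes out almost four times wider at the same megapixel count.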

Cameras are rarely advertised on their sensor sizes, which makes the warning difficult to act on. But the problem usually pops up when companies release two very similar models, one with more megapixels and, most likely, a higher price. In those situations, the extra few hundred dollars doesn't necessarily buy you a better camera.

Sensor technology improves all the time, making the issue of cramped megapixels less important each year. Improved lenses and anti-shake features also dampen the effect.

But even if companies could make a flawless 18-megapixel camera the size of a deck of cards, few people would ever need that much resolution, Gupta says.

