HT4623 I have a 3G iPhone and a new Epson iPrint-compatible printer, but the iPhone can't see it. I have downloaded all the software. Help

I have a 3G iPhone and a new Epson iPrint-compatible printer, but the iPhone can't see it. I have downloaded all the software,
yet my phone does not recognise the printer. Can anyone help? Thanks




Hello all,
I'm trying to connect to a web service that runs on an HTTPS address, and I'm behind a proxy and a firewall.
The first thing I tried was what the WebLogic documentation suggests:
Proxy p = new Proxy(Proxy.Type.HTTP, new InetSocketAddress(proxyHost, Integer.parseInt(proxyPort)));
HttpTransportInfo info = new HttpTransportInfo();
And I got the exception: Unable to tunnel through proxy. Proxy returns "HTTP/1.0 407 Proxy Authentication Required"
I tried some variations, such as using HttpsTransportInfo instead of HttpTransportInfo and setting the proxy data through system properties, but everything gives me the same error.
Are there any other ways to authenticate with the proxy, such as configuring something in the virtual machine?
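One approach that often gets past a 407 from Java code, independently of the WebLogic transport classes, is to register a java.net.Authenticator so the JVM itself can answer the proxy's challenge. A minimal sketch, where the host, port, and credentials are placeholders for your real proxy settings:

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class ProxyAuthDemo {

    // Registers a JVM-wide authenticator that answers proxy 407 challenges.
    static void registerProxyAuth(final String user, final String pass) {
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                // Only answer challenges coming from a proxy, not the target server.
                if (getRequestorType() == RequestorType.PROXY) {
                    return new PasswordAuthentication(user, pass.toCharArray());
                }
                return null;
            }
        });
    }

    public static void main(String[] args) {
        registerProxyAuth("proxyUser", "secret");            // placeholder credentials
        System.setProperty("https.proxyHost", "proxy.example.com"); // placeholder host
        System.setProperty("https.proxyPort", "8080");
        // ...then open the HTTPS connection to the web service as before...
    }
}
```

This is JVM-wide, so it also covers connections the WebLogic client stack opens internally; whether your proxy accepts Basic authentication this way is an assumption worth checking with the proxy administrator.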

Hello experts.
I'm facing the following problem with an XI IDoc-to-IDoc scenario.
XML messages appear in the SXMB_MONI transaction with an error status.
The error looks as follows:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!--  Call Adapter  -->
<SAP:Error xmlns:SAP="" xmlns:SOAP="" SOAP:mustUnderstand="">
     <SAP:P2>Not Found</SAP:P2>
     <SAP:P3 />
     <SAP:P4 />
     <SAP:AdditionalText>..{some text here} ..</SAP:AdditionalText>
     <SAP:ApplicationFaultMessage namespace="" />
  <SAP:Stack>HTTP response contains status code 404 with the description Not Found Error when  sending by HTTP (error code: 404, error text: Not Found)</SAP:Stack>
I suspect the problem is in the HTTP port of the requested URL, because of the following:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!--  Call Adapter   -->
<SAP:OutboundBinding xmlns:SAP="" xmlns:SOAP="">
<SAP:OutboundBindingEntry version="30">
  <SAP:FromPartyName />
  <SAP:ToPartyName />
  <SAP:AdapterTypeData xmlns:SAP="" />
  <SAP:FieldMapping xmlns:SAP="" />
  <SAP:ChannelEntry version="30">
  <SAP:PartyName xmlns:SAP="" />
  <SAP:ServiceName xmlns:SAP="">SXI</SAP:ServiceName>
  <SAP:ChannelName xmlns:SAP="">IDOC</SAP:ChannelName>
  <SAP:AdapterName xmlns:SAP="">XI</SAP:AdapterName>
  <SAP:AdapterNamespace xmlns:SAP=""></SAP:AdapterNamespace>
  <SAP:AdapterSWCV xmlns:SAP="">3B787A8035C111D6BBE0EFE50A1145A5</SAP:AdapterSWCV>
  <SAP:AdapterEngineType xmlns:SAP="">IS</SAP:AdapterEngineType>
  <SAP:AdapterEngineName xmlns:SAP="" />
  <SAP:MessageProtocol xmlns:SAP="">XI</SAP:MessageProtocol>
  <SAP:MessageProtocolVersion xmlns:SAP="">3.0</SAP:MessageProtocolVersion>
  <SAP:TransportProtocol xmlns:SAP="">HTTP</SAP:TransportProtocol>
  <SAP:TransportProtocolVersion xmlns:SAP="">1.0</SAP:TransportProtocolVersion>
  <SAP:ChannelDirection xmlns:SAP="">O</SAP:ChannelDirection>
  <SAP:FromPartyAgency xmlns:SAP="" />
  <SAP:FromPartySchema xmlns:SAP="" />
  <SAP:ToPartySchema xmlns:SAP="" />
  <SAP:ToPartyAgency xmlns:SAP="" />
  <SAP:ChannelAttributes xmlns:SAP="">
  <SAP:AdapterTypeData xmlns:SAP="">
  <SAP:Value />
  <SAP:Value isPassword="true" />
  <SAP:Value> 50000 </SAP:Value>
If I am not mistaken,
  <SAP:Value> 50000 </SAP:Value>
this attribute sets the port of the requested URL; it is 50000, but I guess it should be 8000.
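To confirm which of the two ports (8000 vs. 50000) the Integration Server actually accepts connections on, a plain TCP check from any machine that can reach it will tell you; a minimal Java sketch (the hostname is a placeholder for your XI host):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isPortOpen(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "xi-host" is a placeholder for the Integration Server hostname.
        System.out.println("8000 open?  " + isPortOpen("xi-host", 8000, 2000));
        System.out.println("50000 open? " + isPortOpen("xi-host", 50000, 2000));
    }
}
```

If only one of the two answers, that is strong evidence the channel's port attribute is pointing at the wrong service (ABAP ICM vs. J2EE engine).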

Since moving from SP9 to SP12, all of our communication channels using a receiver file adapter get an HTTP error. One specific scenario is we read a file from UNIX (this works) and then remap and write a file back to the same directory (this gets the HTTP error).
receiver adapter config is as follows:
Adapter type = File
Transport Protocol = File System (NFS)
Message Protocol = File Content Conversion
Adapter Engine = Integration Server
Target Directory = /sap_int/XI_railmax
File Name Scheme = new_sapack.txt
A sender agreement has been created (and worked prior to the upgrade to SP12)
Main Trace log:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!--  Call Adapter  -->
<SAP:Main xmlns:SAP="" xmlns:SOAP="" xmlns:wsu="" versionMajor="003" versionMinor="000" SOAP:mustUnderstand="1" wsu:Id="wsuid-main-92ABE13F5C59AB7FE10000000A1551F7">
<SAP:Sender>
  <SAP:Interface namespace="">MI_BOL_RemapFromFile</SAP:Interface>
<SAP:Receiver>
  <SAP:Party agency="" scheme="" />
  <SAP:Interface namespace="">MI_BOL_RemapToFile</SAP:Interface>
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!--  Call Adapter  -->
<SAP:Error xmlns:SAP="" xmlns:SOAP="" SOAP:mustUnderstand="">
  <SAP:P2>Not Found</SAP:P2>
  <SAP:P3 />
  <SAP:P4 />
  <SAP:AdditionalText><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN"> <html> <head> <title>Error Report</title> <style> td {font-family : Arial, Tahoma, Helvetica, sans-serif; font-size : 14px;} A:link A:visited A:active </style> </head> <body marginwidth="0" marginheight="0" leftmargin="0" topmargin="0" rightmargin="0"> <table width="100%" cellspacing="0" cellpadding="0" border="0" align="left" height="75"> <tr bgcolor="#FFFFFF"> <td align="left" colspan="2" height="48"><font face="Arial, Verdana, Helvetica" size="4" color="#666666"><b>  404 &nbsp Not Found</b></font></td> </tr> <tr bgcolor="#3F73A3"> <td height="23" width="84"><img width=1 height=1 border=0 alt=""></td> <td height="23"><img width=1 height=1 border=0 alt=""></td> <td align="right" height="23"><font face="Arial, Verdana, Helvetica" size="2" color="#FFFFFF"><b>SAP J2EE Engine/6.40 </b></font></td> </tr> <tr bgcolor="#9DCDFD"> <td height="4" colspan="3"><img width=1 height=1 border=0 alt=""></td> </tr> </table> <br><br><br><br><br><br> <p><font face="Arial, Verdana, Helvetica" size="3" color="#000000"><b>  The request can&#39;t be processed.</b></font></p> <p><font face="Arial, Verdana, Helvetica" size="2" color="#000000"><table><tr><td valign="top"><b> Details:</b></td><td valign="top"><PRE>Requested resource &#40; MessagingSystem/servlet/MessagingServlet &#41; not found.</PRE></font></td></tr></table></font></p> </body> </html></SAP:AdditionalText>
  <SAP:ApplicationFaultMessage namespace="" />
  <SAP:Stack>HTTP response contains status code 404 with the description Not Found XML tag Envelope missing in SOAP message header (SAP XI Extension)</SAP:Stack>
NOTE - this is only an issue when writing a FILE. IDocs inbound and outbound still work, the JDBC adapter still works, reading a file still works... it's just a problem with a 'receiver' File Adapter. Any ideas?
thanks /Dave

I am seeing a small percentage of my customers who cannot connect to my server through my Java program to download program updates. The commonality is that all (or most) of them seem to be from Australia, but I have some customers in Australia who CAN connect successfully, so that is probably a red herring.
The problem occurs when creating a URL connection to a text file to read it. The file contains only 3 lines. Like I said, most users do not have a problem, but some do. Could it be that the connection is actually timing out, and if so, how do I lengthen the timeout? Most users say that they do not even have to wait 5 seconds before the error occurs.
The user can connect to the URL with a browser and download the files that way, but this is supposed to be an automatic process. They tell me there is nothing wrong with their network. I'm wondering whether they might have a firewall in place, but when I block connections using my own software firewall, I get a different error instead.
The code looks roughly like this. I've added printlns recently and have not heard back from the customer with any results.
        String urlString = cUpdateLocation + cUpdateFileName;
        try {
            System.out.println("Create the URL: " + urlString);
            URL url = new URL(urlString);
            System.out.println("Open the URL connection");
            URLConnection urlConn = url.openConnection();
            System.out.println("Get an input stream");
            InputStream inputStream = urlConn.getInputStream();
            System.out.println("Define an input stream reader");
            InputStreamReader iReader = new InputStreamReader(inputStream);
            System.out.println("Define a buffered reader");
            BufferedReader bReader = new BufferedReader(iReader);
            String input = "";
            System.out.println("Read from the buffered reader");
            try {
                while (input != null) {
                    input = bReader.readLine();
                    if (input != null) {
                        // Ignore blank lines
                        if (input.length() > 0) {
                            System.out.println("input=" + input);
                        }
                    }
                }
                System.out.println("Done reading from the buffered reader");
                success = true;
            } catch (IOException e) {
                // Catch these IOExceptions here so we can still close the
                // streams and clean up while we have a reference to them
                System.out.println("IOException encountered reading from:\n" +
                                   urlString + "\n" + e);
            }
        } catch (MalformedURLException e) {
            System.out.println("URL " + urlString + " is malformed.  " + e);
        } catch (IOException e) {
            // This is the exception that I normally see.
            System.out.println("IOException encountered creating stream to " +
                               "read from:\n" + urlString + "\n" + e);
        }
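On the timeout question: since Java 5, URLConnection exposes explicit connect and read timeouts, which otherwise default to platform-dependent values. A minimal sketch of setting them (the 30-second values are arbitrary placeholders):

```java
import java.net.URL;
import java.net.URLConnection;

public class TimeoutDemo {

    // Opens a connection with explicit timeouts instead of the platform defaults.
    static URLConnection openWithTimeouts(String urlString) throws Exception {
        URLConnection conn = new URL(urlString).openConnection();
        conn.setConnectTimeout(30000); // ms allowed to establish the TCP connection
        conn.setReadTimeout(30000);    // ms allowed to wait for data once connected
        return conn;
    }
}
```

On 1.4 JVMs, where these setters do not exist, the equivalent knobs are the sun.net.client.defaultConnectTimeout and sun.net.client.defaultReadTimeout system properties (values in milliseconds). Note this only changes how long you wait; if the customers' failures are firewall-related, a longer timeout won't fix them, but the resulting exception should at least become a clear SocketTimeoutException.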

I posted this in the Conventional & Interruptable IO forum, but thought it might be better off here....
I'm having a strange issue with this piece of code. The input stream coming in is quite predictable, and I know that this loop will only iterate twice: once for less than 1024 bytes, and the next returns -1. The problem I'm having is with the second iteration. The in.read(bary) line takes around 75 seconds just to return that -1 so we can break out of the loop. I don't understand how the first read is so fast and the second one is so slow. Any thoughts?
URL url = new URL("http://xyz");
HttpURLConnection con = (HttpURLConnection) url.openConnection();
DataInputStream in = new DataInputStream(con.getInputStream());
StringBuffer httpResponse = new StringBuffer();
byte[] bary = new byte[1024];
while (true) {
    int bytesRead = in.read(bary);
    if (bytesRead <= 0) {
        break;
    }
    httpResponse.append(new String(bary, 0, bytesRead));
} // end while
I also want to mention that this bit of code runs in J2EE apps deployed to 5 different environments, one of them being production, which is heavily hit. I'm only having the issue in one of them... not production :). They're running on Solaris 8, Sun's SDK 1.4.2_3, and WebLogic 8.1. Also, I've run the exact same code making this call and I know that the app on the other end responds fast; we've also verified this using the web server's access log.
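One common cause of exactly this symptom: if the server keeps the connection alive, read() only returns -1 once the server finally closes the socket, which can take as long as its keep-alive timeout, and that timeout can differ between environments. When the response declares a Content-Length, one way to avoid the wait is to read exactly that many bytes instead of looping until EOF. A sketch under that assumption:

```java
import java.io.DataInputStream;
import java.io.InputStream;

public class BoundedRead {

    // Reads exactly 'length' bytes (the declared Content-Length) instead of
    // looping until EOF, so we never wait for the server to close the socket.
    static String readBody(InputStream in, int length) throws Exception {
        byte[] body = new byte[length];
        new DataInputStream(in).readFully(body);
        return new String(body);
    }
}
```

The declared length is available via con.getContentLength(); if the server sends a chunked response with no Content-Length, this approach doesn't apply and the slow environment's keep-alive settings would be the next thing to compare.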
Thanks for reading,

The approach we used with WebLogic 10.3.5 to successfully pass internet proxy authentication no longer seems to work.
Here you go:
and still the same exception occurs. Any ideas?

I am calling an HTTPService (id=data2php) which runs a PHP
script that creates an XML doc and puts it on my server. I then
call another HTTPService (id=feedRequest) which reads the XML doc,
and I then try to populate a DataGrid with the lastResult of the
HTTPService that reads the XML doc. My issue is that the
HTTPService which reads the XML doc reads the previous incarnation
of the XML doc, not the newly created one that my HTTPService
(data2php) created. I've been manually going to the XML doc in my
browser and hitting refresh for my feedRequest HTTPService to read
the updated XML.
Here is the MXML for the HTTPService that reads the XML:
<mx:HTTPService id="feedRequest" url=""
result="feedResult(event)" resultFormat="xml" useProxy="false"/>
Pretty straightforward.
(Actually, once in a while it will read the updated

Hi, I'm trying to write a Java client that will achieve the same as the following curl command:
/usr/bin/curl -k --cert /tmp/x509up_u10002 https://mysecure_server
The cert specified is obtained from a MyProxy server (
I have written a Java client that works fine if it's given a p12 certificate and password, but I need it to work with the passwordless cert obtained from the MyProxy server.
I believe that the cert /tmp/x509up_u10002 is in PEM format.
All my attempts result in errors such as: No trusted certificate found
(this is despite having the CA cert in the truststore AND an 'all-trusting' TrustManager)
or Received fatal alert: handshake_failure
or even toDerInputStream rejects tag type 45 ?????
I'm totally new to this security stuff and would very much appreciate any help or pointers.
Will the Java security mechanisms support such a cert without a password?
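For what it's worth, the stock JSSE keystore types (JKS, PKCS12) don't read PEM key pairs directly, which would explain errors like "toDerInputStream rejects tag type 45" when a PEM file is fed to DER-expecting code. One workaround is to convert the PEM credential to PKCS#12 first (with a throwaway export password, since Java keystores generally want one) and build the SSLContext from that. A sketch; the file name, the password, and the openssl invocation in the comment are assumptions to adapt:

```java
import java.io.FileInputStream;
import java.security.KeyStore;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;

public class ClientCertSetup {

    // Builds an SSLContext carrying a PKCS#12 client credential. A PEM
    // credential would first be converted with something like:
    //   openssl pkcs12 -export -in /tmp/x509up_u10002 -inkey /tmp/x509up_u10002 -out proxy.p12
    static SSLContext contextFromP12(String path, char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("PKCS12");
        // path == null loads an empty keystore (useful for testing the wiring)
        ks.load(path == null ? null : new FileInputStream(path), password);
        KeyManagerFactory kmf =
            KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(ks, password == null ? new char[0] : password);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(kmf.getKeyManagers(), null, null); // null = default trust managers
        return ctx;
    }
}
```

The resulting context's socket factory can be set on HttpsURLConnection. Note the "No trusted certificate found" error is a separate problem: an all-trusting TrustManager only helps if it is actually installed on the connection doing the handshake, which is easy to miss when a library opens its own connections.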
Many Thanks

This is my old post that I made in another Firefox forum topic; it was off topic there, so I'm asking here how to solve the issue.
Why doesn't HTTPS Everywhere update? And from the site it won't let me install the new version either. I have 4.0.2, and the new version is 4.0.3, from here, its official site.
I was reading from here about it; is it a bad idea to use it if it has issues with Mozilla? Should I replace it with the add-on that is mentioned in that Mozilla topic?
I got the new version a different way, but I wonder whether that is bad?
Should I replace it with something, and with what?
Any more suggestions for its replacement, or is the link above the best?
I heard it causes problems for some people. What kind of problems? Like when writing a comment everything suddenly freezes for a few seconds so you can't write, and page-loading lag where loading freezes for a few seconds sometimes? Are these issues caused by HTTPS Everywhere?
Off-topic questions:
BTW, in my profile folder an Adblock folder still exists. Why, when I removed it a long time ago? Does ABE use Adblock files? Because there were blocked sites in there that ABE did not have when I installed ABE.
I want to know what NVIDIA 3D Vision is and what it does, and should I always enable it on YouTube, or should I switch it to "never ask" in plugins?
Should I disconnect after sync? Can the connection to the sync server cause some issues too, like slight lag after deleting a bookmark? I can go to another opened tab, but I can't open one of the bookmarked pages from bookmarks for a few seconds after deleting a bookmark. Or is there an issue with More In Content UI + 2.2?

Hello to you all, if you are able to help with a DNS problem that I'm having then please accept my thanks and appreciation in advance.
First, some background information. I recently moved my server from my studio to my house, where a new purpose-built studio will soon be erected. At my old studio any requests came in via the IP (whether that be http, https, ftp etc.) from the domain registrar, and the router would send the request to the relevant port number, whether that be 80 for http or 443 for https etc., and all was well, as this location had a fixed IP address. Unfortunately at my new location, whilst I have a much faster connection, I do not have a fixed IP. To get around this I have the following setup (not ideal for a business I know, but perfectly OK for home hosting): I set up two pseudo nameservers at ( and which track the changes in my IP address and update their records accordingly; my registrar then sends any requests to these 'nameservers' and no-ip then forwards them on to my server. So far so good.
The problem arises once the requests get to my server. Whilst I have DNS set up, I can only receive requests from a straight request to the server, i.e. it will display the site without any problem, but if I then put a www in front of that, or try to access the https part of my site (which is set up as a separate site on the same server), then the server throws an error. I have tried to put an alias (CNAME) into the zone, but it does not want to resolve the request. I have searched around but to no avail; I am totally new to DNS, so I am currently on a steep learning curve and fumbling around in the dark.
The first thing I need to get working is for the www request to be resolved correctly, and then (and this is where the real fun starts!) to dynamically update the IP in the DNS records as the IP changes. I will probably have to get help with this, as I understand it requires BIND, of which I know nothing; first though I'd like to get the pages served up correctly. Advice, hints, tips or links to tutorials all greatly appreciated. Full setup listed below.
Many thanks, David.
Xserve PPC G5 running 10.5.8 unlimited, set up as a standalone OD master
CradlePoint MBR1200 Gateway router, which acts as the DHCP server; set up as 2 separate sites located on the Xraid
Current DNS setup:
Primary Zone name: with nameservers and and allow zone transfers in checked
Primary Zone
        Machine (external IP)
        Machine (external IP)
        Machine (external IP)
With the reverse zone looking thus, with 'allow zone transfers' checked:
Reverse Zone
        Reverse mapping
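For the www and https names, the zone needs alias records pointing at a name that actually resolves. A hypothetical BIND-style fragment of what the forward zone would contain, where example.com and 203.0.113.10 are placeholders since the real domain and IP aren't shown above:

```
; hypothetical zone fragment -- example.com and 203.0.113.10 are placeholders
$TTL 3600
@        IN  SOA    ns1.example.com. admin.example.com. (
                        2024010101 ; serial -- must be bumped on every change
                        3600 900 1209600 300 )
@        IN  NS     ns1.example.com.
@        IN  NS     ns2.example.com.
@        IN  A      203.0.113.10   ; the current external IP
www      IN  CNAME  example.com.   ; lets www.example.com follow the bare name
secure   IN  CNAME  example.com.   ; same idea for the separate https site
```

If Server Admin is writing the zone for you, the equivalent would be adding an alias to the machine record rather than editing the file by hand; either way, the alias must target the bare domain (which carries the A record), and the web server must have a virtual host answering for the www and https names, otherwise the name will resolve but the server will still throw an error.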