Hey guys, mild programmer here. I've got a design question for you. I've made a MySQL/PHP site for displaying the comics I make :) It works pretty well, but now I'm looking to refine my interface. One of my problems is with the primary key; currently it's defined as auto_increment. The annoying thing is that if I delete a comic and add a new one, the primary key numbers it after the old one, not after the last one still in the table, so there's a gap in the numbering. My code currently relies on the primary keys being continuous... I know in an ideal world I'd program it to handle gaps, but this is just a personal site XD And even if I did program it to handle gaps, I'd still like the primary keys to be continuous, just because I'm a bit OCD like that :P

Anyway, I'm wondering how to make the tables easier to edit. Should/can I take out the auto_increment attribute and just assign a key one greater than the last key in the table via PHP when I upload a new comic? Should I do something complicated like copy the edited info to a temporary new table, drop and recreate the old table, and repopulate it with the new list each time I delete a row? Or is there a command that will reset the auto_increment counter to the last primary key (or to a number of my choosing)? Thanks!
EDIT: Looking around online some more, the common advice seems to be to drop and then re-add the ID column, which automatically renumbers the rows. Seems a little sketchy, but if it works I guess it'd do. Thoughts?
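From what I can tell, the commands involved would be roughly these; "comics" and "id" are just placeholder names here, and I haven't actually tested any of it yet:

-- the drop-and-re-add approach from the EDIT above: renumbers every row
ALTER TABLE comics DROP COLUMN id;
ALTER TABLE comics ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;

-- or, to just pull the counter back down after deleting the newest rows
-- (MySQL won't set it below the current maximum id):
ALTER TABLE comics AUTO_INCREMENT = 1;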
The following has been giving me a bit of a problem over the past few days. I have an initial display page with an HTML form and text fields, but the number of text fields can be controlled by the user. As such, they are all placed into an array within the $_POST array, to be retrieved later. I've tried doing this, and that part alone works fine. The problem is when I try to loop through them to insert them into the database: while they display fine through echo, only the first one will insert into the database, and then it throws an error. Below is a loose sample of the code.
The text fields in the HTML file are named like this: name="name[]" (so PHP receives them as an array in $_POST).
The code in the PHP file is as follows, excluding anything not necessary for the example:
$conn = db_connect(); // establishes a database connection
if (!$conn) {
    echo "database error";
    exit;
}
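And this is roughly what the insert loop looks like; treat the table and column names as placeholders rather than my exact code:

$names = $_POST['name'];   // the array of text-field values
foreach ($names as $name) {
    $name = mysql_real_escape_string($name);
    $result = mysql_query("INSERT INTO entries (name) VALUES ('$name')");
    if (!$result) {
        echo "database error: " . mysql_error();
    }
}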
If anyone can spot something I'm overlooking, please let me know. I'm still at a loss as to why it will only insert on the first pass through the loop, even though just echoing the values works fine.
I have a function that reads a CSV file (it's actually a pipe-separated file...), exported from Access and dumped into MySQL. After some formatting and setup, I start a while loop, reading each line of the file and updating or inserting into the table as necessary. My problem is that after the while loop, the script seems to just die. Everything is updated/inserted correctly, but it just stops after the loop. The "report" code never happens. The stuff that executes outside of the function after I call it never happens. Can anyone see anything wrong with this? I can post the entire function if you'd like.
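The structure of the function is roughly this; I've stripped it down and the file/column names are placeholders, so treat it as a sketch rather than the real thing:

$fp = fopen('export.txt', 'r');   // the pipe-separated export from Access
while (($fields = fgetcsv($fp, 4096, '|')) !== false) {
    // ... UPDATE or INSERT into the table as necessary ...
}
fclose($fp);

echo "Report:";   // this part, and everything after the function call, never runs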
There are 140 lines in stats.txt; fields are terminated by a tab, and lines are terminated by a newline. I get 140 new entries, but the "id" column (auto_increment) is "0" for every one of them.
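The import statement looks roughly like this (the table name is a placeholder):

LOAD DATA LOCAL INFILE 'stats.txt'
INTO TABLE stats
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';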
Okay, here's one that's a bit beyond my SQL Server knowledge.
I need to generate a random number from 1 to n (where n is 2, 3, or 4) for a column that gets inserted into a table via a SELECT from a temporary table.
INSERT INTO QA_Exam_QuestionsDetail
    ( EmployeeID, ExamID, QuestionID, AnswerID, OriginalAnswerID )
SELECT @EmployeeID,
       @examID,
       QuestionID,
       Round((@Upper * Rand() + 1), 0),
       B.AnswerID
FROM #tempSelectedQuestionTable A, QA_Answers B
WHERE A.QuestionID = B.QuestionID
I've removed a lot of stuff from this query; there are many other columns, but those aren't important.
Yes, it's stupid that I have to use the AnswerID to key against another table but at the same time need to randomize this ID so the display order is different for each session. No, we can't do it in the front end. It's Flash, and even worse than this garbage.
Here are the problems:
@Upper isn't actually a variable. It has to be counted from the table for each set of values keyed by their QuestionID. So... there could be data like this:
QuestionID   AnswerID
2            1
2            2
2            3
3            1
3            2
3            3
3            4
For the two sets of rows, QuestionID 2 and QuestionID 3, the upper bound of the random value is different: 3 for QuestionID 2 and 4 for QuestionID 3. I guess this sounds like a nested select, but I'm not sure how to write it. Usually there will be about 50 questions, so we're looking at up to 200 or so rows.
I don't think RAND() returns a different value for every row, only once per query. What are the alternatives?
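To make the question concrete, this is roughly the direction I'm picturing: a nested select to count the answers per question, plus something like ABS(CHECKSUM(NEWID())) for per-row randomness, which I've only seen suggested and haven't tried:

INSERT INTO QA_Exam_QuestionsDetail
    ( EmployeeID, ExamID, QuestionID, AnswerID, OriginalAnswerID )
SELECT @EmployeeID,
       @examID,
       A.QuestionID,
       ABS(CHECKSUM(NEWID())) % C.AnswerCount + 1,   -- random 1..n, evaluated per row
       B.AnswerID
FROM #tempSelectedQuestionTable A
JOIN QA_Answers B ON A.QuestionID = B.QuestionID
JOIN ( SELECT QuestionID, COUNT(*) AS AnswerCount
       FROM QA_Answers
       GROUP BY QuestionID ) C ON C.QuestionID = A.QuestionID

The derived table C is only there to give each QuestionID its own upper bound in place of @Upper.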
Are there any problems with using error_log() with message type 1 (email) that I should be aware of? I just discovered it while researching something else and would like to abuse the hell out of it on a production website. It would be nice to suppress all error messages and instead have them piped to a dedicated email address.
Other question: security. Any recommended reads, blogs, notes, whatnot on the subject? And I mean anything and everything.
Also, it turns out I mistakenly posted this to another community, so to save time, this is what I mean by error_log():
$link = mysql_connect("www.myServer.com", "myAccount", "password");
if ($link == false) {
    $errMsg  = "SITE: www.myServer.com\n";
    $errMsg .= __FILE__ . "@" . __LINE__ . "\n";
    $errMsg .= "Detail: Unable to connect to mysql server\n";
    $errMsg .= "mysql_error: " . mysql_error() . "\n";
    error_log($errMsg, 1, "errMsg@myOtherServer.com");
    // then die or skip rest of script
}
And while I'm on the subject, is there anything equivalent to C/C++'s #define? I really miss it, and it would come in handy with something like the above code.
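(The closest thing I've found so far is define() for plain constants, something like the sketch below, but it doesn't do macro-style text substitution the way #define does.)

// plain constants; the names/values here just mirror the snippet above
define('SITE_HOST', 'www.myServer.com');
define('ERR_EMAIL', 'errMsg@myOtherServer.com');

$link = mysql_connect(SITE_HOST, 'myAccount', 'password');
if (!$link) {
    error_log('Unable to connect to ' . SITE_HOST . "\n", 1, ERR_EMAIL);
}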
Sooo, I'm 98% sure I'm taking on a client for whom I need to implement a CMS so he can make changes to the site if necessary. When I first wrote the quote a few months ago, I suggested Drupal as the CMS because it's free, looks rather customizable, and seems somewhat easy to use. I have limited experience writing PHP and dealing with MySQL databases, but I understand the mechanics of it.
This particular client is offering subscriptions to a database of business industry contacts. There will be about 500 contacts on the pay list (including name, office, affiliation, etc.), and he will add to the list about every three months or so. What he will change on a regular monthly basis are the articles he writes. The subscription concept therefore requires a shopping cart/checkout system as well as a username/password management system for each user.
Would Drupal be a good way to go, both for my client and for me? Also, I'm switching web hosts and am considering becoming a reseller. Is there a web host that has it installed already? Door Host ( http://www.doorhost.net/ ) used to have it, but I don't think they do anymore. I'm considering going with Total Choice Hosting and buying either their basic reseller plan or their regular Deluxe Plan.
In any case, who has used this or other comparable CMSs, and which do you prefer and why?
Oh, I just found this Drupal web hosting company called Site Ground that has all of the major CMSs installed already and gives you 24,000 MB of space. Anyone heard of them? I also found these guys: Canaca.com.
I asked earlier about how to speed up a query that uses many NOT LIKE '%45%' conditions. Thanks to some advice, I'm starting to understand what I need to do. Since I'm a newbie SQL wannabe, it will take me a while. I'm still not there yet; I think the best plan is to add another TINYINT column and mark the records I do not want to show. I'm not quite sure of the syntax, or how to do "if this, then set this value in the TINYINT column" so that I can use it as a marker for records NOT to show. I figure I'd do that about once a month to mark all the records I don't want shown in queries. Any other ideas? Maybe it's easier to just filter them in the PHP web code?
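Something like this is what I'm picturing; the table and column names are made up, and the LIKE condition stands in for whatever the real NOT LIKE filters are:

ALTER TABLE records ADD COLUMN hide_flag TINYINT NOT NULL DEFAULT 0;

-- run about once a month to mark the records I don't want shown
UPDATE records
SET hide_flag = 1
WHERE some_field LIKE '%45%';

-- the display query then only has to check the flag
SELECT * FROM records WHERE hide_flag = 0;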
One thing I did find out was that MySQL had query_cache_size = 0. I set it to 80 megabytes and now any duplicate query is instant!! Nice... I was wondering why it couldn't cache duplicate searches and some other things like that.
There's been plenty of debate in-office about this.
We have a SQL-head who writes long, long, looong SQL statements to process information. On the other hand, another developer believes that SQL statements should be short and that the code should process that information separately. Each argues that his way is the more 'efficient' of the two.
Sorry to be posting again with nothing to actually contribute, but I'm having a hacker issue with one of my websites. I'm not one to jump to sudden suspicions of hackers, and I don't like to play the victim, but this is the second time someone's hacked my site.
After the first time, I was extremely cautious. I uploaded my site to a new server and made sure not to install any interactive PHP scripts. I did, however, continue to code my website in basic PHP, but nothing that required a connection to an SQL database or any sort of login: just simple PHP pages with dynamic inclusion and switch statements.
[/END SOB STORY]
My friend's webhost (my friend was hosting me at the time) sent this as a response to my e-mail:
"Do NOT put any php pages back up on this site if you wish to host it with us and certainly not any phpbb boards which were most likely used in the attempt to hack our server."
Considering I did not have a phpBB script uploaded, the only alternative is that he hacked my site through my actual PHP pages (at least according to her webhost).
So my question is this: what are the security risks/vulnerabilities of plain, non-interactive PHP pages?
And if anyone can provide any security tips, that would be greatly appreciated!
The good news is, I'm working on a simple gallery script that does not require an SQL connection or anything, so hopefully I'll be able to post that soon! :)
EDIT: Here's the code I've been using for the main page/subpages:

include("language.php");
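The rest follows the dynamic-inclusion-with-a-switch pattern I described above, roughly like this, with placeholder page names rather than my real ones:

$page = isset($_GET['page']) ? $_GET['page'] : 'home';
switch ($page) {
    case 'comics':
        include('comics.php');
        break;
    case 'about':
        include('about.php');
        break;
    default:
        include('home.php');
        break;
}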
Phase 1: Basic web form with a JavaScript helper script that isn't required.
Phase 2: Collect the user's information for the sale from the previous form, validate/normalize it (strip everything that isn't alphanumeric, except ' and -), and insert it into the DB with a pending flag.
Embed the record ID into a hidden form field and present the user with the sales choice (invoice, electronic invoice, credit).
Phase 3: Grab the embedded form field, validate the ID with an integer cast ("$x = (int) $y;"), compare REMOTE_ADDR to the one stored in the DB record, and update the record via a super-limited DB user account (its MySQL privileges only allow it to UPDATE accountInfo.payMethod and to SELECT accountInfo.ID and accountInfo.IP).
Display a confirmation page for the order, then jump to the CCard service and out of my responsibility.
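A minimal sketch of the Phase 3 check I have in mind; the accountInfo table and its columns come from the description above, while the form field names and connection handling are placeholders:

$id = (int) $_POST['recordID'];                       // the "x = (int) y;" validation
$ip = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);

// the limited account can only SELECT accountInfo.ID / accountInfo.IP
// and UPDATE accountInfo.payMethod
$result = mysql_query("SELECT ID FROM accountInfo WHERE ID = $id AND IP = '$ip'");
if ($result && mysql_num_rows($result) == 1) {
    $method = mysql_real_escape_string($_POST['payMethod']);
    mysql_query("UPDATE accountInfo SET payMethod = '$method' WHERE ID = $id");
} else {
    exit('Validation failed.');   // ID or IP doesn't match the stored record
}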
I can't see how someone could break this system as it stands, but I'm tempted to do the entire thing with session handling, passing the SID to the second form and retrieving it on the third to get $_SESSION['myID'], just to make it even harder to break.
I have these files generated by a Perl script, each called something.hwd.
The format of each file is the title, then a pipe symbol, and then the HTML of the article. Example:
This is the title | Some html about the article goes here
Does anyone know how to mass-import about 500 of these files into MySQL as records in a single table? I was going to write a script that would grab the date and time stamp of each file and then rewrite the file in this format:
2006-06-06 | Title | The HTML
I almost had it working with MySQL's import command, but some records were skipped.
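To show what I mean, the whole thing could probably be done straight from a small script instead of going through the import command; something roughly like this, with the path and table/column names as placeholders (untested):

$files = glob('/path/to/hwd/*.hwd');                      // placeholder path
foreach ($files as $file) {
    $stamp   = date('Y-m-d H:i:s', filemtime($file));     // the file's date/time stamp
    $content = file_get_contents($file);
    list($title, $html) = explode('|', $content, 2);      // "Title | HTML", split only once

    $title = mysql_real_escape_string(trim($title));
    $html  = mysql_real_escape_string(trim($html));

    mysql_query("INSERT INTO articles (created, title, body)
                 VALUES ('$stamp', '$title', '$html')")
        or die(mysql_error());
}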
Anyway, if you can't figure it out, no problem. The client is willing to start with a fresh website.
OK, I've picked a new web host. I googled this but couldn't get an answer. I would like to transfer emails that are stored on my current host's server to the new host and make them viewable through the new host's webmail. Is this kind of thing possible? Would it depend on the web host and the email program they use?
I would also like to transfer my message board. Would it be as simple as downloading the MySQL database and uploading it to the new server?
I recently got to see a demo of the next version of ADO.NET by Shyam Pather, introducing some pretty drastic changes from the current version. They are maintaining backwards compatibility, but they are adding a lot of new features which I think will alter the way data access with .NET is done in pretty significant ways. There's fairly well-integrated support for more of an ORM approach using a tweaked SQL dialect, for example. Also, there's upcoming support for LINQ...
In Visual Studio .NET Pro, my solution has two projects: one Web Site project and one SQL Server project. I want to move the two projects into different physical directories on my HDD and preserve the same solution. When I do that, the solution complains that it can't find the projects (duh). If I try "Add Existing Items", I am able to add the SQL Server project, but the Web Site project does not have a file like ".csproj", so adding it again after moving it doesn't seem to work. Could someone suggest how I can move the projects into different folders while preserving my solution?
(This annoying stuff has to be done because AnkhSVN does not work unless the projects are in the solution directory, grrr.)
Follow-up: I moved the projects and edited the paths in the .sln file. Now AnkhSVN is able to export the SQL Server project to the Subversion repository, but it still refuses to export the Web Site project, complaining that it is not under the solution root directory. I've already put it in the same folder as the .sln file, and it still complains. Do you know why?
I've been thinking about an anti-flood mechanism for PHP that doesn't involve some sort of captcha. I looked into using shared memory, but that's back to the same problem of a race condition between scripts. As a temporary workaround I guess I could just use a MySQL table, but I was hoping for something with a smaller memory and speed footprint. Anyone got any ideas?
Example of what I mean:
If 10,000 hits are made to a PHP script in less than 30 minutes, the script has a switch at the beginning that immediately calls die().
I thought of a cron script that runs every 30 minutes, counts the number of new lines from an offset in the access log, and then adds "deny from $userIP" under the "Order deny,allow" directive in .htaccess, but I was wondering if there is anything else before I start writing a bunch of regexes to parse the log.
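For completeness, the MySQL-table fallback I mentioned would look roughly like this; the table name, columns, and the way the threshold is handled are all placeholders:

// CREATE TABLE hit_log (ip VARCHAR(15), hit_time DATETIME, KEY (hit_time));

$ip = mysql_real_escape_string($_SERVER['REMOTE_ADDR']);
mysql_query("INSERT INTO hit_log (ip, hit_time) VALUES ('$ip', NOW())");

$result = mysql_query("SELECT COUNT(*) FROM hit_log
                       WHERE hit_time > NOW() - INTERVAL 30 MINUTE");
$row = mysql_fetch_row($result);
if ($row[0] >= 10000) {
    die();   // the "switch at the beginning" that shuts the script off
}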
I have been searching Google for a PHP calendar for a few days now and haven't found one that I like. I'm specifically looking for one that just lists the dates of important events, not an actual calendar grid. I've looked on codegrrl.com, but I couldn't find the calendar my friend Crystal used. If you have any suggestions, please post them!
Or if you've just created a PHP calendar and want someone to try it out, that's fine too.
Oh, by the way, I'm pretty new at PHP, so I want a simple calendar that is easy to install. Also, my server doesn't support MySQL, which can be a problem!
Five reasons, without the satire, for using the PostgreSQL open source database.
Availability on multiple platforms
PostgreSQL is available on every modern Unix-compatible operating system and on Windows; ports are also available for Novell NetWare and OS/2.
Professional development and administration tools
The following list summarizes just a few of the tools available to PostgreSQL developers:
Database modeling: Several commercial and open [...]